59 results for automatic test case generation
Abstract:
The double triangular test was introduced twenty years ago, and the purpose of this paper is to review applications that have been made since then. In fact, take-up of the method was rather slow until the late 1990s, but in recent years several clinical trial reports have been published describing its use in a wide range of therapeutic areas. The core of this paper is a detailed account of five trials that have been published since 2000 in which the method was applied to studies of pancreatic cancer, breast cancer, myocardial infarction, epilepsy and bedsores. Before those accounts are given, the method is described and the history behind its evolution is presented. The future potential of the method for sequential case-control and equivalence trials is also discussed. Copyright © 2004 John Wiley & Sons, Ltd.
Abstract:
Many families of interspersed repetitive DNA elements, including human Alu and LINE (Long Interspersed Element) elements, have been proposed to have accumulated through repeated copying from a single source locus: the "master gene." The extent to which a master gene model is applicable has implications for the origin, evolution, and function of such sequences. One repetitive element family for which a convincing case for a master gene has been made is the rodent ID (identifier) elements. Here we devise a new test of the master gene model and use it to show that mouse ID element sequences are not compatible with a strict master gene model. We suggest that a single master gene is rarely, if ever, likely to be responsible for the accumulation of any repeat family.
Abstract:
Inferring the spatial expansion dynamics of invading species from molecular data is notoriously difficult due to the complexity of the processes involved. For these demographic scenarios, genetic data obtained from highly variable markers may be profitably combined with specific sampling schemes and information from other sources using a Bayesian approach. The geographic range of the introduced toad Bufo marinus is still expanding in eastern and northern Australia, in each case from isolates established around 1960. A large amount of demographic and historical information is available on both expansion areas. In each area, samples were collected along a transect representing populations of different ages and genotyped at 10 microsatellite loci. Five demographic models of expansion, differing in the dispersal pattern for migrants and founders and in the number of founders, were considered. Because the demographic history is complex, we used an approximate Bayesian method, based on a rejection-regression algorithm, to formally test the relative likelihoods of the five models of expansion and to infer demographic parameters. A stepwise migration-foundation model with founder events was statistically better supported than the other four models in both expansion areas. Posterior distributions supported different dynamics of expansion in the studied areas. Populations in the eastern expansion area have a lower stable effective population size and have been founded by a smaller number of individuals than those in the northern expansion area. Once demographically stabilized, populations exchange a substantial number of effective migrants per generation in both expansion areas, and such exchanges are larger in northern than in eastern Australia. The effective number of migrants appears to be considerably lower than that of founders in both expansion areas. We found our inferences to be relatively robust to various assumptions on marker, demographic, and historical features. The method presented here is the only robust, model-based method available so far for inferring complex population dynamics over a short time scale. It also provides the basis for investigating the interplay between population dynamics, drift, and selection in invasive species.
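The rejection step at the heart of such an approximate Bayesian computation can be illustrated with a minimal sketch. Everything below is a hypothetical stand-in (a toy one-parameter simulator, an invented observed summary statistic, a uniform prior on founder number); the paper's coalescent/microsatellite simulator and its regression-adjustment step are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_founders):
    """Toy stand-in for a coalescent/microsatellite simulator:
    returns one summary statistic for a founding event."""
    return rng.normal(loc=np.log(n_founders), scale=0.5)

observed = 2.3       # observed summary statistic (hypothetical value)
tolerance = 0.1      # acceptance threshold
n_draws = 50_000

# Rejection step: sample the parameter from its prior, simulate a summary
# statistic, and keep draws whose summaries land close to the observation.
prior_draws = rng.uniform(1.0, 100.0, size=n_draws)   # prior on founder number
summaries = np.fromiter((simulate(n) for n in prior_draws), dtype=float)
accepted = prior_draws[np.abs(summaries - observed) < tolerance]

print(f"accepted {accepted.size} of {n_draws} draws; "
      f"posterior mean founder number ~ {accepted.mean():.1f}")
```

The accepted draws approximate the posterior; the rejection-regression algorithm the authors cite additionally adjusts accepted parameter values by a local regression on the summary statistics.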
Abstract:
Clients and contractors need to be aware of the project’s legal environment because the viability of a procurement strategy can be vitiated by legal rules. This is particularly true regarding Performance-Based Contracting (PBC) whose viability may be threatened by rules of property law: while the PBC concept does not require that the contractor transfers the ownership in the building materials used to the client, the rules of property law often lead to an automatic transfer of ownership. But does the legal environment really render PBC unfeasible? In particular, is PBC unfeasible because contractors lose their materials as assets? These questions need to be answered with respect to the applicable property law. As a case study, English property law has been chosen. Under English law, the rule which governs the automatic transfer of ownership is called quicquid plantatur solo, solo cedit (whatever is fixed to the soil belongs to the soil). An analysis of this rule reveals that not all materials which are affixed to land become part of the land. This fate only occurs in relation to materials which have been affixed with the intention of permanently improving the land. Five fictitious PBC cases have been considered in terms of the legal status of the materials involved, and several subsequent legal questions have been addressed. The results suggest that English law does actually threaten the feasibility of PBC in some cases. However, it is also shown that the law provides means to circumvent the unwanted results which flow from the rules of property law. In particular, contractors who are interested in keeping their materials as assets can insist on agreeing a property right in the client’s land, i.e. a contractor’s lien. Therefore, the outcome is that English property law does not render the implementation of the PBC concept unfeasible. At a broader level, the results contribute to the theoretical framework of PBC as an increasingly used procurement strategy.
Abstract:
Automatic indexing and retrieval of digital data poses major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieval of such data based on the semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years, research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system (Dynamic REtrieval Analysis and semantic metadata Management (DREAM)) designed to automatically and intelligently index huge repositories of special effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase to support the process of storage, indexing and retrieval of large data sets of special effects video clips as an exemplar application domain. This paper provides its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes. © 2009 Published by Elsevier B.V.
Abstract:
This paper presents an efficient construction algorithm for obtaining sparse kernel density estimates based on a regression approach that directly optimizes model generalization capability. Computational efficiency of the density construction is ensured using an orthogonal forward regression, and the algorithm incrementally minimizes the leave-one-out test score. A local regularization method is incorporated naturally into the density construction process to further enforce sparsity. An additional advantage of the proposed algorithm is that it is fully automatic and the user is not required to specify any criterion to terminate the density construction procedure. This is in contrast to an existing state-of-the-art kernel density estimation method using the support vector machine (SVM), where the user is required to specify some critical algorithm parameter. Several examples are included to demonstrate the ability of the proposed algorithm to effectively construct a very sparse kernel density estimate with comparable accuracy to that of the full sample optimized Parzen window density estimate. Our experimental results also demonstrate that the proposed algorithm compares favorably with the SVM method, in terms of both test accuracy and sparsity, for constructing kernel density estimates.
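The general idea of a sparse kernel density estimate that terminates automatically can be sketched as follows. This is not the paper's orthogonal forward regression with local regularization and leave-one-out scoring; it is a simplified greedy forward selection of kernel centres scored on held-out log-likelihood, with toy data and a fixed kernel width chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def gauss(x, c, h):
    """Gaussian kernel of width h centred at c, evaluated at points x."""
    return np.exp(-0.5 * ((x - c) / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

# Toy data: a two-component mixture, shuffled and split into halves.
data = np.concatenate([rng.normal(-2.0, 0.6, 200), rng.normal(1.5, 1.0, 200)])
rng.shuffle(data)
train, valid = data[:200], data[200:]
h = 0.5  # fixed kernel width for the sketch

centres, best = [], -np.inf
while len(centres) < 25:
    # Greedily add the training point that most improves validation log-likelihood.
    gain, pick = -np.inf, None
    for c in train:
        dens = np.mean([gauss(valid, t, h) for t in centres + [c]], axis=0)
        ll = np.log(dens + 1e-300).sum()
        if ll > gain:
            gain, pick = ll, c
    if gain <= best:  # terminate automatically once no candidate improves the fit
        break
    centres.append(pick)
    best = gain

print(f"sparse estimate keeps {len(centres)} of {train.size} kernels")
```

The stopping rule mirrors the abstract's point that no user-specified termination criterion is needed: construction halts when adding another kernel no longer improves generalization.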
Abstract:
The associative sequence learning model proposes that the development of the mirror system depends on the same mechanisms of associative learning that mediate Pavlovian and instrumental conditioning. To test this model, two experiments used the reduction of automatic imitation through incompatible sensorimotor training to assess whether mirror system plasticity is sensitive to contingency (i.e., the extent to which activation of one representation predicts activation of another). In Experiment 1, residual automatic imitation was measured following incompatible training in which the action stimulus was a perfect predictor of the response (contingent) or not at all predictive of the response (noncontingent). A contingency effect was observed: there was less automatic imitation, indicative of more learning, in the contingent group. Experiment 2 replicated this contingency effect and showed that, as predicted by associative learning theory, it can be abolished by signaling trials in which the response occurs in the absence of an action stimulus. These findings support the view that mirror system development depends on associative learning and indicate that this learning is not purely Hebbian. If this is correct, associative learning theory could be used to explain, predict, and intervene in mirror system development.
Abstract:
Recent research in cognitive neuroscience has found that observation of human actions activates the ‘mirror system’ and provokes automatic imitation to a greater extent than observation of non-biological movements. The present study investigated whether this human bias depends primarily on phylogenetic or ontogenetic factors by examining the effects of sensorimotor experience on automatic imitation of non-biological robotic stimuli. Automatic imitation of human and robotic action stimuli was assessed before and after training. During these test sessions, participants were required to execute a pre-specified response (e.g. to open their hand) while observing a human or robotic hand making a compatible (opening) or incompatible (closing) movement. During training, participants executed opening and closing hand actions while observing compatible (group CT) or incompatible movements (group IT) of a robotic hand. Compatible, but not incompatible, training increased automatic imitation of robotic stimuli (speed of responding on compatible trials, compared with incompatible trials) and abolished the human bias observed at pre-test. These findings suggest that the development of the mirror system depends on sensorimotor experience, and that, in our species, it is biased in favour of human action stimuli because these are more abundant than non-biological action stimuli in typical developmental environments.
Abstract:
Wind generation’s contribution to meeting extreme peaks in electricity demand is a key concern for the integration of wind power. In Great Britain (GB), robustly assessing this contribution directly from power system data (i.e. metered wind-supply and electricity demand) is difficult as extreme peaks occur infrequently (by definition) and measurement records are both short and inhomogeneous. Atmospheric circulation-typing combined with meteorological reanalysis data is proposed as a means to address some of these difficulties, motivated by a case study of the extreme peak demand events in January 2010. A preliminary investigation of the physical and statistical properties of these circulation types suggests that they can be used to identify the conditions that are most likely to be associated with extreme peak demand events. Three broad cases are highlighted as requiring further investigation. The high-over-Britain anticyclone is found to be generally associated with very low winds but relatively moderate temperatures (and therefore moderate peak demands, somewhat in contrast to the classic low-wind cold snap that is sometimes apparent in the literature). In contrast, both longitudinally extended blocking over Scotland/Scandinavia and latitudinally extended troughs over western Europe appear to be more closely linked to the very cold GB temperatures (usually associated with extreme peak demands). In both of these latter situations, wind resource averaged across GB appears to be more moderate.
Abstract:
The “case for property” in the mixed-asset portfolio is a topic of continuing interest to practitioners and academics. Such an analysis typically is performed over a fixed period of time and the optimum allocation to property inferred from the weight assigned to property through the use of mean-variance analysis. It is well known, however, that the parameters used in the portfolio analysis problem are unstable through time. Thus, the weight proposed for property in one period is unlikely to be that found in another. Consequently, in order to assess the case for property more thoroughly, the impact of property in the mixed-asset portfolio is evaluated on a rolling basis over a long period of time. In this way we test whether the inclusion of property significantly improves the performance of an existing equity/bond portfolio all of the time. The main findings are that the inclusion of direct property into an existing equity/bond portfolio leads to increases or decreases in return, depending on the relative performance of property compared with the other asset classes. However, including property in the mixed-asset portfolio always leads to reductions in portfolio risk. Consequently, adding property into an equity/bond portfolio can lead to significant increases in risk-adjusted performance. Thus, if the decision to include direct property in the mixed-asset portfolio is based upon its diversification benefits the answer is yes, there is a “case for property” all the time!
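The risk-reduction mechanism behind this result can be sketched with a minimal minimum-variance calculation. The volatilities and correlations below are illustrative assumptions, not the paper's data, and the single-period calculation stands in for the paper's rolling analysis.

```python
import numpy as np

def min_var_weights(cov):
    """Minimum-variance weights: w proportional to inv(cov) @ 1, normalized."""
    w = np.linalg.inv(cov) @ np.ones(cov.shape[0])
    return w / w.sum()

# Illustrative annual volatilities and correlations (hypothetical values):
# asset order is equities, bonds, direct property.
vols = np.array([0.16, 0.07, 0.09])
corr = np.array([[1.0, 0.2, 0.3],
                 [0.2, 1.0, 0.1],
                 [0.3, 0.1, 1.0]])
cov = np.outer(vols, vols) * corr

for name, idx in [("equity/bond", [0, 1]), ("equity/bond/property", [0, 1, 2])]:
    sub = cov[np.ix_(idx, idx)]
    w = min_var_weights(sub)
    risk = np.sqrt(w @ sub @ w)
    print(f"{name}: weights={np.round(w, 2)}, portfolio vol={risk:.3f}")
```

Adding the third, low-correlation asset lowers the minimum-variance portfolio's volatility; repeating such a comparison on a rolling window is what lets the paper test whether the diversification benefit holds all of the time.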
Abstract:
Warfarin resistance was first discovered among Norway rat (Rattus norvegicus) populations in Scotland in 1958 and further reports of resistance, both in this species and in others, soon followed from other parts of Europe and the United States. Researchers quickly defined the practical impact of these resistance phenomena and developed robust methods by which to monitor their spread. These tasks were relatively simple because of the high degree of immunity to warfarin conferred by the resistance genes. Later, the second generation anticoagulants were introduced to control rodents resistant to the warfarin-like compounds, but resistance to difenacoum, bromadiolone and brodifacoum is now reported in certain localities in Europe and elsewhere. However, the adoption of test methods designed initially for use with the first generation compounds to identify resistance to compounds of the second generation has led to some practical difficulties in conducting tests and in establishing meaningful resistance baselines. In particular, the results of certain test methodologies are difficult to interpret in terms of the likely impact on practical control treatments of the resistance phenomena they seek to identify. This paper defines rodenticide resistance in the context of both first and second generation anticoagulants. It examines the advantages and disadvantages of existing laboratory and field methods used in the detection of rodent populations resistant to anticoagulants and proposes some improvements in the application of these techniques and in the interpretation of their results.
Abstract:
Models of root system growth emerged in the early 1970s, and were based on mathematical representations of root length distribution in soil. The last decade has seen the development of more complex architectural models and the use of computer-intensive approaches to study developmental and environmental processes in greater detail. There is a pressing need for predictive technologies that can integrate root system knowledge, scaling from molecular to ensembles of plants. This paper makes the case for more widespread use of simpler models of root systems based on continuous descriptions of their structure. A new theoretical framework is presented that describes the dynamics of root density distributions as a function of individual root developmental parameters such as rates of lateral root initiation, elongation, mortality, and gravitropism. The simulations resulting from such equations can be performed most efficiently in discretized domains that deform as a result of growth, and that can be used to model the growth of many interacting root systems. The modelling principles described help to bridge the gap between continuum and architectural approaches, and enhance our understanding of the spatial development of root systems. Our simulations suggest that root systems develop in travelling wave patterns of meristems, revealing order in otherwise spatially complex and heterogeneous systems. Such knowledge should assist physiologists and geneticists to appreciate how meristem dynamics contribute to the pattern of growth and functioning of root systems in the field.
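A toy illustration of the continuous-density idea: assuming a one-dimensional advection-reaction equation for meristem density (elongation as advection, branching minus mortality as net growth, with illustrative parameter values rather than the paper's model), a travelling front of meristems emerges from the dynamics.

```python
import numpy as np

# Illustrative parameters (not the paper's): depth step dz (cm), time step dt (d),
# grid size nz, number of steps nt; elongation speed e (cm/d), branching rate b
# and mortality rate m (1/d). Note e*dt/dz = 0.1 < 1, so the scheme is stable.
dz, dt, nz, nt = 0.1, 0.01, 300, 2000
e, b, m = 1.0, 0.3, 0.1

rho = np.zeros(nz)   # meristem density per unit depth
rho[0] = 1.0         # all meristems start at the soil surface

# Upwind finite-difference update of d(rho)/dt = -e * d(rho)/dz + (b - m) * rho
for _ in range(nt):
    adv = -e * (rho - np.roll(rho, 1)) / dz
    adv[0] = -e * rho[0] / dz          # no flux enters from above the surface
    rho = rho + dt * (adv + (b - m) * rho)

peak = np.argmax(rho)
print(f"after {nt * dt:.0f} d the meristem wave peaks near {peak * dz:.1f} cm depth")
```

The initial pulse of meristems propagates downward at the elongation speed while growing in amplitude, which is the kind of travelling-wave order in meristem density that the abstract describes.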
Abstract:
Various studies investigating the future impacts of integrating high levels of renewable energy make use of historical meteorological (met) station data to produce estimates of future generation. Hourly means of 10 m horizontal wind are extrapolated to a standard turbine hub height using the wind profile power law or log law and used to simulate the hypothetical power output of a turbine at that location; repeating this procedure using many viable locations can produce a picture of future electricity generation. However, the estimate of hub height wind speed is dependent on the choice of the wind shear exponent α or the roughness length z0, and requires a number of simplifying assumptions. This paper investigates the sensitivity of this estimation on generation output using a case study of a met station in West Freugh, Scotland. The results show that the choice of wind shear exponent is a particularly sensitive parameter which can lead to significant variation of estimated hub height wind speed and hence estimated future generation potential of a region.
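Both extrapolation laws are standard; a minimal sketch (with illustrative heights, wind speed, and parameter values, not the West Freugh data) shows how sensitive the hub-height estimate is to the choice of α and z0.

```python
import numpy as np

def power_law(u_ref, z_ref, z_hub, alpha):
    """Wind profile power law: u(z) = u_ref * (z / z_ref) ** alpha."""
    return u_ref * (z_hub / z_ref) ** alpha

def log_law(u_ref, z_ref, z_hub, z0):
    """Logarithmic wind profile: u(z) proportional to ln(z / z0)."""
    return u_ref * np.log(z_hub / z0) / np.log(z_ref / z0)

u10, z_ref, z_hub = 6.0, 10.0, 80.0   # 10 m wind speed (m/s), heights (m)

# Sensitivity of hub-height speed to the shear exponent alpha.
for alpha in (1 / 7, 0.20, 0.30):
    print(f"alpha={alpha:.3f}: u_hub = {power_law(u10, z_ref, z_hub, alpha):.2f} m/s")

# Sensitivity to the roughness length z0 (m) in the log law.
for z0 in (0.01, 0.1, 0.5):
    print(f"z0={z0}: u_hub = {log_law(u10, z_ref, z_hub, z0):.2f} m/s")
```

Because turbine power output scales roughly with the cube of wind speed below rated power, even modest differences in the extrapolated hub-height speed are amplified into large differences in estimated generation, which is the sensitivity the paper quantifies.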
Abstract:
High-resolution simulations over a large tropical domain (∼20°S–20°N and 42°E–180°E) using both explicit and parameterized convection are analyzed and compared to observations during a 10-day case study of an active Madden-Julian Oscillation (MJO) event. The parameterized convection model simulations at both 40 km and 12 km grid spacing have a very weak MJO signal and little eastward propagation. A 4 km explicit convection simulation using Smagorinsky subgrid mixing in the vertical and horizontal dimensions exhibits the best MJO strength and propagation speed. 12 km explicit convection simulations also perform much better than the 12 km parameterized convection run, suggesting that the convection scheme, rather than horizontal resolution, is key for these MJO simulations. Interestingly, a 4 km explicit convection simulation using the conventional boundary layer scheme for vertical subgrid mixing (but still using Smagorinsky horizontal mixing) completely loses the large-scale MJO organization, showing that relatively high resolution with explicit convection does not guarantee a good MJO simulation. Models with a good MJO representation have a more realistic relationship between lower-free-tropospheric moisture and precipitation, supporting the idea that moisture-convection feedback is a key process for MJO propagation. There is also increased generation of available potential energy and conversion of that energy into kinetic energy in models with a more realistic MJO, which is related to larger zonal variance in convective heating and vertical velocity, larger zonal temperature variance around 200 hPa, and larger correlations between temperature and ascent (and between temperature and diabatic heating) between 500–400 hPa.