951 results for Testing Source Code Generation
Abstract:
Mode of access: Internet.
Abstract:
Work performed at CANEL.
Abstract:
"Printed: May 1990."
Abstract:
"ILENR/RE-WR-93/01."
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-04
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
A low-density, male-based linkage map was constructed as one of the objectives of the International Equine Gene Mapping Workshop. Here we report the second generation map based on testing 503 half-sibling offspring from 13 sire families for 344 informative markers using the crimap program. The multipoint linkage analysis localized 310 markers (90%) with 257 markers being linearly ordered. The map included 34 linkage groups representing all 31 autosomes and spanning 2262 cM with an average interval between loci of 10.1 cM. This map is a milestone in that it is the first map with linkage groups assigned to each of the 31 autosomes and a single linkage group for all but three chromosomes.
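The headline statistics of this abstract are internally consistent, which can be checked with a few lines of arithmetic. The sketch below assumes (this is not stated in the abstract) that the average interval is computed over the 257 linearly ordered markers, so the interval count is ordered loci minus linkage groups.

```python
# Sanity check of the map statistics reported in the abstract.
total_cm = 2262        # total map length, cM
ordered_loci = 257     # linearly ordered markers
linkage_groups = 34    # one chain of intervals per linkage group

# Assumption: intervals = ordered loci - linkage groups = 223.
intervals = ordered_loci - linkage_groups
avg_interval = total_cm / intervals
print(round(avg_interval, 1))  # 10.1 cM, matching the reported value

localized, tested = 310, 344
print(round(100 * localized / tested))  # 90, the reported percentage
```

Under that assumption the reported 10.1 cM average spacing and the 90% localization rate both reproduce exactly.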
Abstract:
Re-Os data for chromite separates from 10 massive chromitite seams sampled along the 550-km length of the 2.58-Ga Great Dyke layered igneous complex, Zimbabwe, record initial 187Os/188Os ratios in the relatively narrow range between 0.1106 and 0.1126. This range of initial 187Os/188Os values is only slightly higher than the value for the coeval primitive upper mantle (0.1107) as modeled from the Re-Os evolution of chondrites and data from modern mantle melts and mantle-derived xenoliths. Analyses of Archean granitoid and gneiss samples from the Zimbabwe Craton show extremely low Os concentrations (3-9 ppt) with surprisingly unradiogenic present-day 187Os/188Os signatures between 0.167 and 0.297. Only one sample yields an elevated 187Os/188Os ratio of 1.008. Using these data, the range of crustal contamination of the Great Dyke magma would be minimally 0%-33% if the magma source was the primitive upper mantle, whereas the range estimated from Nd and Pb isotope systematics is 5%-25%. If it is assumed that the primary Great Dyke magma derived from an enriched deep mantle reservoir (via a plume), a better agreement can be obtained. A significant contribution from a long-lived subcontinental lithospheric mantle (SCLM) reservoir with subchondritic Re/Os to the Great Dyke melts cannot be reconciled with the Os isotope results at all. However, Os isotope data on pre-Great Dyke ultramafic complexes of the Zimbabwe Craton and thermal modeling show that such an SCLM existed below the Zimbabwe Craton at the time of the Great Dyke intrusion. It is therefore concluded that large melt volumes such as that giving rise to the Great Dyke were able to pass lithospheric mantle keels without significant contamination in the late Archean. Because the ultramafic-mafic melts forming the Great Dyke must have originated below the SCLM (which extends to at least a 200-km depth), the absence of an SCLM signature precludes a subduction-related magma-generation process.
Abstract:
Formal specifications can precisely and unambiguously define the required behavior of a software system or component. However, formal specifications are complex artifacts that need to be verified to ensure that they are consistent, complete, and validated against the requirements. Specification testing or animation tools exist to assist with this by allowing the specifier to interpret or execute the specification. However, currently little is known about how to do this effectively. This article presents a framework and tool support for the systematic testing of formal, model-based specifications. Several important generic properties that should be satisfied by model-based specifications are first identified. Following the idea of mutation analysis, we then use variants or mutants of the specification to check that these properties are satisfied. The framework also allows the specifier to test application-specific properties. All properties are tested for a range of states that are defined by the tester in the form of a testgraph, which is a directed graph that partially models the states and transitions of the specification being tested. Tool support is provided for the generation of the mutants, for automatically traversing the testgraph and executing the test cases, and for reporting any errors. The framework is demonstrated on a small specification and its application to three larger specifications is discussed. Experience indicates that the framework can be used effectively to test small to medium-sized specifications and that it can reveal a significant number of problems in these specifications.
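The core idea of the framework — exercise a model-based specification and its mutants along the arcs of a tester-supplied testgraph, checking generic properties in each reached state — can be sketched in miniature. All names below are hypothetical illustrations, not the authors' tool:

```python
# Minimal sketch of testgraph-driven mutation analysis of a spec.
# Toy model-based spec: a bounded counter with invariant 0 <= n <= MAX.
MAX = 3
spec = {"inc": lambda n: min(n + 1, MAX), "dec": lambda n: max(n - 1, 0)}

# One mutant per mutated operation (here 'dec' forgets the lower bound).
mutants = {"dec_no_floor": {**spec, "dec": lambda n: n - 1}}

invariant = lambda n: 0 <= n <= MAX   # generic property for all states

# Testgraph arcs (start_state, operation), chosen by the tester to
# partially cover the specification's states and transitions.
testgraph = [(0, "dec"), (0, "inc"), (3, "inc"), (1, "dec")]

def run(ops):
    """Execute each testgraph arc; return arcs violating the invariant."""
    return [(s, op) for s, op in testgraph if not invariant(ops[op](s))]

print(run(spec))                # [] -> the spec satisfies the property
for name, m in mutants.items():
    if run(m):                  # a violation means the mutant is killed
        print(name, "killed")
```

A mutant that survives every testgraph arc signals either an incomplete testgraph or a property the specification does not actually pin down — the same diagnostic the article's framework provides at full scale.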
Abstract:
Denial is a commonly used strategy to rebut a false rumor. However, there is a dearth of empirical research on the effectiveness of denials in combating rumors. Treating denials as persuasive messages, we conducted 3 laboratory-based simulation studies testing the overall effectiveness of denials in reducing belief and anxiety associated with an e-mail virus rumor. Under the framework of the elaboration likelihood model, we also tested the effects of denial message quality and source credibility, and the moderating effects of personal relevance. Overall, the results provided some support for the effectiveness of denials with strong arguments and an anxiety-alleviating tone in reducing rumor-related belief and anxiety. The effects of denial wording and source credibility were visible for participants who perceived high personal relevance of the topic. Limitations of the current research and future research directions are discussed.
Abstract:
We study Greenberger-Horne-Zeilinger-type (GHZ-type) and W-type three-mode entangled coherent states. Both types of entangled coherent states violate Mermin's version of the Bell inequality with threshold photon detection (i.e., without photon counting). Such an experiment can be performed using linear optics elements and threshold detectors with significant Bell violations for GHZ-type entangled coherent states. However, to demonstrate Bell-type inequality violations for W-type entangled coherent states, additional nonlinear interactions are needed. We also propose an optical scheme to generate W-type entangled coherent states in free-traveling optical fields. The required resources for the generation are a single-photon source, a coherent state source, beam splitters, phase shifters, photodetectors, and Kerr nonlinearities. Our scheme does not necessarily require strong Kerr nonlinear interactions; i.e., weak nonlinearities can be used for the generation of the W-type entangled coherent states. Furthermore, it is also robust against inefficiencies of the single-photon source and the photon detectors.
Abstract:
A novel method that relies on the decoupling of the energy production and biosynthesis processes was used to characterise the maintenance, cell lysis and growth processes of Nitrosomonas sp. A Nitrosomonas culture was enriched in a sequencing batch reactor (SBR) with ammonium as the sole energy source. Fluorescent in situ hybridization (FISH) showed that Nitrosomonas bound to the NEU probe constituted 82% of the bacterial population, while no other known ammonium or nitrite oxidizing bacteria were detected. Batch tests were carried out under conditions in which both ammonium and CO2 were in excess, and in the absence of one of these two substrates. The oxygen uptake rate and nitrite production rate were measured during these batch tests. The results obtained from these batch tests, along with the SBR performance data, allowed the determination of the maintenance coefficient and the in situ cell lysis rate, as well as the maximum specific growth rate of the Nitrosomonas culture. It is shown that, during normal growth, the Nitrosomonas culture spends approximately 65% of the energy generated on maintenance. The maintenance coefficient was determined to be 0.14-0.16 mgN mgCOD(biomass)(-1) h(-1), and was shown to be independent of the specific growth rate. The in situ lysis rate and the maximum specific growth rate of the Nitrosomonas culture were determined to be 0.26 and 1.0 day(-1) (0.043 h(-1)), respectively, under aerobic conditions at 30 degrees C and pH 7. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
Testing concurrent software is difficult due to problems with inherent nondeterminism. In previous work, we have presented a method and tool support for the testing of concurrent Java components. In this paper, we extend that work by presenting and discussing techniques for testing Java thread interrupts and timed waits. Testing thread interrupts is important because every Java component that calls wait must have code dealing with these interrupts. For a component that uses interrupts and timed waits to provide its basic functionality, the ability to test these features is clearly even more important. We discuss the application of the techniques and tool support to one such component, which is a nontrivial implementation of the readers-writers problem.
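The paper targets Java's wait/interrupt machinery, but the underlying testing difficulty — deterministically driving a component into both its timed-wait-expired and its normally-woken paths — can be illustrated in Python with a condition variable. The `Buffer` class below is a hypothetical stand-in, not the authors' component or their ConAn tool:

```python
# Sketch: exercising both branches of a timed wait in a tiny component.
import threading

class Buffer:
    """One-slot buffer whose get() gives up after a timeout."""
    def __init__(self):
        self.cond = threading.Condition()
        self.item = None

    def get(self, timeout):
        with self.cond:
            while self.item is None:
                if not self.cond.wait(timeout=timeout):
                    return None            # timed wait expired
            item, self.item = self.item, None
            return item

    def put(self, item):
        with self.cond:
            self.item = item
            self.cond.notify()             # wake a blocked consumer

buf = Buffer()
print(buf.get(timeout=0.05))               # None: exercises the timeout path

# Exercise the normal path: a second thread supplies an item.
t = threading.Timer(0.01, buf.put, args=(42,))
t.start()
print(buf.get(timeout=1.0))                # 42: woken before the timeout
t.join()
```

The key point the paper makes carries over: a test suite that never forces the timeout (or, in Java, the `InterruptedException`) branch leaves the component's most error-prone code unexecuted.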
Abstract:
Starting with a UML specification that captures the underlying functionality of some given Java-based concurrent system, we describe a systematic way to construct, from this specification, test sequences for validating an implementation of the system. The approach is to first extend the specification to create UML state machines that directly address those aspects of the system we wish to test. To be specific, the extended UML state machines can capture state information about the number of waiting threads or the number of threads blocked on a given object. Using the SAL model checker we can generate from the extended UML state machines sequences that cover all the various possibilities of events and states. These sequences can then be directly transformed into test sequences suitable for input into a testing tool such as ConAn. As an illustration, the methodology is applied to generate sequences for testing a Java implementation of the producer-consumer system. © 2005 IEEE
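The sequence-generation step can be sketched without a model checker: a breadth-first search over a small explicit state machine yields, for every reachable state, a shortest event sequence driving the system there. The producer-consumer model below is a deliberately tiny hypothetical stand-in for the extended UML state machines, and the search replaces SAL for illustration only:

```python
# Sketch: BFS-based generation of state-covering test sequences.
from collections import deque

# Toy producer-consumer with a one-slot buffer; a state is
# (items_in_buffer, consumers_blocked_on_empty_buffer).
def step(state, event):
    items, waiting = state
    if event == "produce" and items == 0:
        return (1, 0)                  # item stored, blocked consumer freed
    if event == "consume" and items == 1:
        return (0, waiting)
    if event == "consume" and items == 0 and waiting == 0:
        return (0, 1)                  # consumer blocks on empty buffer
    return None                        # event not enabled in this state

EVENTS = ("produce", "consume")
start = (0, 0)

paths = {start: []}                    # state -> shortest event sequence
queue = deque([start])
while queue:
    s = queue.popleft()
    for e in EVENTS:
        t = step(s, e)
        if t is not None and t not in paths:
            paths[t] = paths[s] + [e]
            queue.append(t)

for state, seq in sorted(paths.items()):
    print(state, seq)                  # one covering sequence per state
```

Each resulting sequence corresponds to one test input for a tool such as ConAn; a real model checker additionally covers event *combinations* and can witness liveness properties, which plain reachability search does not.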
Abstract:
In this thesis, we consider four different scenarios of interest in modern satellite communications. For each scenario, we will propose the use of advanced solutions aimed at increasing the spectral efficiency of the communication links. First, we will investigate the optimization of the current standard for digital video broadcasting. We will increase the symbol rate of the signal and determine the optimal signal bandwidth. We will apply the time packing technique and propose a specifically designed constellation. We will then compare some receiver architectures with different performance and complexity. The second scenario still addresses broadcast transmissions, but in a network composed of two satellites. We will compare three alternative transceiver strategies, namely, signals completely overlapped in frequency, frequency division multiplexing, and the Alamouti space-time block code, and, for each technique, we will derive theoretical results on the achievable rates. We will also evaluate the performance of these techniques in three different channel models. The third scenario deals with the application of multiuser detection in multibeam satellite systems. We will analyze a case in which the users are near the edge of the coverage area and, hence, experience a high level of interference from adjacent cells. In this case as well, three different approaches will be compared: a classical approach in which each beam carries information for a single user, a cooperative solution based on time division multiplexing, and the Alamouti scheme. The information-theoretic analysis will be followed by the study of practical coded schemes. We will show that the theoretical bounds can be approached by a properly designed code or bit mapping. Finally, we will consider an Earth observation scenario, in which data is generated on the satellite and then transmitted to the ground. We will study two channel models, taking into account one or two transmit antennas, and apply techniques such as time and frequency packing, signal predistortion, multiuser detection and the Alamouti scheme.
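The Alamouti space-time block code that recurs throughout the thesis admits a compact illustration. The sketch below shows only the textbook noiseless 2x1 case with plain complex arithmetic; the thesis evaluates the scheme over satellite channel models, which this does not attempt to reproduce:

```python
# Sketch: Alamouti 2x1 space-time block coding, noiseless flat channel.

def alamouti_encode(s1, s2):
    # Two time slots x two antennas: slot 1 sends (s1, s2),
    # slot 2 sends (-conj(s2), conj(s1)).
    return [(s1, s2), (-s2.conjugate(), s1.conjugate())]

def alamouti_decode(r1, r2, h1, h2):
    # Linear combining recovers g*s1 and g*s2 with g = |h1|^2 + |h2|^2,
    # i.e. full transmit diversity with a simple linear receiver.
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1 = (h1.conjugate() * r1 + h2 * r2.conjugate()) / g
    s2 = (h2.conjugate() * r1 - h1 * r2.conjugate()) / g
    return s1, s2

# Flat channel gains from the two transmit antennas to the receiver.
h1, h2 = 0.8 + 0.3j, -0.5 + 0.9j
s1, s2 = 1 - 1j, -1 + 1j                  # two QPSK-like symbols

(x11, x21), (x12, x22) = alamouti_encode(s1, s2)
r1 = h1 * x11 + h2 * x21                  # received in slot 1
r2 = h1 * x12 + h2 * x22                  # received in slot 2

print(alamouti_decode(r1, r2, h1, h2))    # recovers (s1, s2)
```

The orthogonal structure is what makes the scheme attractive in the two-satellite scenario above: the receiver needs only channel estimates and linear operations, with no joint detection across the two transmitters.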