951 results for Testing Source Code Generation


Relevance:

30.00%

Abstract:

Various unification schemes interpret the complex phenomenology of quasars and luminous active galactic nuclei (AGN) in terms of a simple picture involving a central black hole, an accretion disc and an associated outflow. Here, we continue our tests of this paradigm by comparing quasar spectra to synthetic spectra of biconical disc wind models, produced with our state-of-the-art Monte Carlo radiative transfer code. Previously, we have shown that we could produce synthetic spectra resembling those of observed broad absorption line (BAL) quasars, but only if the X-ray luminosity was limited to 10^43 erg s^-1. Here, we introduce a simple treatment of clumping, and find that a filling factor of ~0.01 moderates the ionization state sufficiently for BAL features to form in the rest-frame UV at more realistic X-ray luminosities. Our fiducial model shows good agreement with AGN X-ray properties and the wind produces strong line emission in, e.g., Lyα and C IV 1550 Å at low inclinations. At high inclinations, the spectra possess prominent LoBAL features. Despite these successes, we cannot reproduce all emission lines seen in quasar spectra with the correct equivalent-width ratios, and we find an angular dependence of emission line equivalent width despite the similarities in the observed emission line properties of BAL and non-BAL quasars. Overall, our work suggests that biconical winds can reproduce much of the qualitative behaviour expected from a unified model, but we cannot yet provide quantitative matches with quasar properties at all viewing angles. Whether disc winds can successfully unify quasars is therefore still an open question.
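
The effect of the clumping parameter can be illustrated with a back-of-the-envelope calculation. In a microclumping prescription, matter is compressed into clumps filling a fraction f_V of the volume, so the local density rises by 1/f_V and the ionization parameter U, which scales inversely with density, falls by a factor f_V. A minimal sketch in Python, with illustrative numbers that are not taken from the models above:

```python
import math

def ionization_parameter(Q_H, r_cm, n_H):
    """Ionization parameter U = Q_H / (4 pi r^2 n_H c),
    with Q_H the ionizing photon rate (photons/s)."""
    c = 2.998e10  # speed of light, cm/s
    return Q_H / (4.0 * math.pi * r_cm**2 * n_H * c)

# Illustrative (not paper) values: photon rate, radius, smooth-wind density.
Q_H = 1e56        # ionizing photons per second
r = 1e17          # cm
n_smooth = 1e8    # cm^-3

f_V = 0.01                 # volume filling factor, as in the fiducial model
n_clump = n_smooth / f_V   # density inside the clumps

U_smooth = ionization_parameter(Q_H, r, n_smooth)
U_clump = ionization_parameter(Q_H, r, n_clump)
print(f"U (smooth wind) = {U_smooth:.3g}")
print(f"U (clumped)     = {U_clump:.3g}  -> reduced by factor f_V = {f_V}")
```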

Relevance:

30.00%

Abstract:

We propose a mechanism for testing the theory of collapse models such as continuous spontaneous localization (CSL) by examining the parametric heating rate of a trapped nanosphere. The random localizations of the centre of mass of a given particle predicted by the CSL model can be understood as a stochastic force that constitutes a source of heating for the nanosphere. We show that by utilising a Paul trap to levitate the particle, combined with optical cooling, it is possible to reduce environmental decoherence to such a level that CSL dominates the dynamics and contributes the main source of heating. We show that this approach allows measurements to be made on a timescale of seconds, and that the free parameter λ_CSL which characterises the model ought to be testable down to values as low as 10^-12 Hz.
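
The logic of the test can be sketched numerically. Any momentum-diffusion process with coefficient D_p (so that <p^2> grows as 2*D_p*t per axis) heats the centre of mass at a rate dE/dt = D_p/m, and CSL becomes testable once its effective D_p exceeds the environmental one. The mapping from λ_CSL to D_p depends on the particle geometry and the CSL correlation length and is not reproduced here, so the coupling constant below is a placeholder assumption, as are all numerical values:

```python
# Sketch: compare heating rates from momentum diffusion.
# If <p^2> grows as 2*D_p*t per axis, then E = p^2/(2m) grows as dE/dt = D_p/m.

m = 1e-18                 # kg, ~100 nm silica sphere (illustrative)
D_p_env = 1e-46           # kg^2 m^2 s^-3, residual environmental diffusion (placeholder)
D_p_per_lambda = 1e-32    # kg^2 m^2 s^-3 per Hz of lambda_CSL (placeholder
                          # geometry-dependent coupling, NOT from the paper)

def heating_rate(D_p, mass):
    """Energy gain rate (watts) from momentum diffusion of strength D_p."""
    return D_p / mass

lam_csl = 1e-12  # Hz, the target sensitivity quoted above
rate_csl = heating_rate(D_p_per_lambda * lam_csl, m)
rate_env = heating_rate(D_p_env, m)
print(f"CSL heating rate:           {rate_csl:.2e} W")
print(f"Environmental heating rate: {rate_env:.2e} W")
print("CSL-dominated regime:", rate_csl > rate_env)
```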

Relevance:

30.00%

Abstract:

A small-scale sample nuclear waste package, consisting of a 28 mm diameter uranium penny encased in grout, was imaged by absorption contrast radiography using a single pulse exposure from an X-ray source driven by a high-power laser. The Vulcan laser was used to deliver a focused pulse of photons to a tantalum foil, in order to generate a bright burst of highly penetrating X-rays (with energy >500 keV), with a source size of <0.5 mm. BAS-TR and BAS-SR image plates were used for image capture, alongside a newly developed Thallium-doped Caesium Iodide scintillator-based detector coupled to CCD chips. The uranium penny was clearly resolved to sub-mm accuracy over a 30 cm^2 scan area from a single-shot acquisition. In addition, neutron generation was demonstrated in situ with the X-ray beam, in a single shot, thus demonstrating the potential for multi-modal criticality testing of waste materials. This feasibility study successfully demonstrated non-destructive radiography of encapsulated, high-density nuclear material. With recent developments of high-power laser systems towards 10 Hz operation, a laser-driven multi-modal beamline for waste monitoring applications is envisioned.
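
Absorption contrast radiography of this kind rests on Beer-Lambert attenuation, I = I0 * exp(-mu * x): the image contrast comes from the much stronger attenuation of uranium relative to grout at high photon energy. A minimal sketch with assumed round-number attenuation coefficients and thicknesses (not values from the study):

```python
import math

def transmission(mu_cm, thickness_cm):
    """Beer-Lambert transmission I/I0 = exp(-mu * x)."""
    return math.exp(-mu_cm * thickness_cm)

# Illustrative linear attenuation coefficients in the ~500 keV-1 MeV range
# (assumed round numbers, not measured values from the experiment).
mu_uranium = 1.5   # cm^-1
mu_grout = 0.15    # cm^-1

t_penny = 0.5      # cm of uranium along the beam (illustrative)
t_grout = 10.0     # cm of grout along the beam (illustrative)

through_grout = transmission(mu_grout, t_grout)
through_both = through_grout * transmission(mu_uranium, t_penny)

contrast = (through_grout - through_both) / through_grout
print(f"Transmission (grout only):    {through_grout:.3f}")
print(f"Transmission (grout + penny): {through_both:.3f}")
print(f"Relative contrast on detector: {contrast:.1%}")
```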

Relevance:

30.00%

Abstract:

A grid-connected DFIG for wind power generation can affect power system small-signal angular stability in two ways: by changing the system load flow condition and by dynamically interacting with synchronous generators (SGs). This paper presents the application of the conventional method of damping torque analysis (DTA) to examine the effect of the DFIG's dynamic interactions with SGs on small-signal angular stability. It shows that the effect is due to the dynamic variation of power exchange between the DFIG and the power system, and that it can be estimated approximately by the DTA. Consequently, if the DFIG is modelled as a constant power source (i.e., zero dynamic interaction is assumed), the impact of the change of load flow brought about by the DFIG can be determined. The total effect of the DFIG can thus be estimated as the result of the DTA added to that of the constant power source model. Applications of the proposed DTA method are discussed. An example multi-machine power system with grid-connected DFIGs is presented to demonstrate and validate the proposed method and the conclusions obtained in the paper.
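
The small-signal machinery that the DTA builds on can be sketched generically: linearize the system about an operating point and inspect the eigenvalues of the state matrix, where the damping ratio of each electromechanical mode indicates whether a given change (such as connecting a DFIG) improves or degrades angular stability. A minimal sketch on an assumed two-state swing-equation example, not the multi-machine model of the paper:

```python
import numpy as np

# Linearized single-machine swing dynamics, state x = [delta, omega]:
#   d(delta)/dt = w_s * omega
#   d(omega)/dt = (-Ks * delta - D * omega) / (2H)
w_s = 2 * np.pi * 50   # synchronous speed, rad/s
H = 4.0                # inertia constant, s
Ks = 1.2               # synchronizing torque coefficient (illustrative)
D = 2.0                # damping torque coefficient (illustrative)

A = np.array([[0.0, w_s],
              [-Ks / (2 * H), -D / (2 * H)]])

for lam in np.linalg.eigvals(A):
    sigma, omega = lam.real, lam.imag
    if omega > 0:  # report each oscillatory mode once
        zeta = -sigma / np.hypot(sigma, omega)  # damping ratio
        print(f"mode {sigma:.3f} +/- j{omega:.3f} rad/s, "
              f"f = {omega / (2 * np.pi):.2f} Hz, damping ratio = {zeta:.3f}")
```

Changing D to mimic the extra damping torque contributed by a DFIG, and re-running, shows how a single coefficient shifts the mode's damping ratio.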

Relevance:

30.00%

Abstract:

Lung cancer diagnostics have progressed greatly over the past decade. Development of molecular testing to identify an increasing number of potentially clinically actionable genetic variants, using smaller samples obtained via minimally invasive techniques, is a huge challenge. Tumour heterogeneity and cancer evolution in response to therapy mean that repeat biopsies or circulating biomarkers are likely to be increasingly useful to adapt treatment as resistance develops. We highlight some of the current challenges faced in clinical practice for molecular testing of EGFR, ALK, and new biomarkers such as PD-L1. Implementation of next generation sequencing platforms for molecular diagnostics in non-small-cell lung cancer is increasingly common, allowing testing of multiple genetic variants from a single sample. The use of next generation sequencing to recruit for molecularly stratified clinical trials is discussed in the context of the UK Stratified Medicine Programme and the UK National Lung Matrix Trial.

Relevance:

30.00%

Abstract:

Ageing and deterioration of infrastructure is a challenge facing transport authorities. In particular, there is a need for increased bridge monitoring in order to provide adequate maintenance and to guarantee acceptable levels of transport safety. The Intelligent Infrastructure group at Queen's University Belfast (QUB) is working on a number of aspects of infrastructure monitoring, and this paper presents summarised results from three distinct monitoring projects carried out by this group. Firstly, the findings from a project on next-generation Bridge Weigh-in-Motion (B-WIM) are reported; this includes full-scale field testing using fibre optic strain sensors. Secondly, results from early-phase testing of a computer vision system for bridge deflection monitoring are reported. This research seeks to exploit recent advances in image processing technology with a view to developing contactless bridge monitoring approaches; considering the logistical difficulty of installing sensors on a 'live' bridge, contactless monitoring has some inherent advantages over conventional contact-based sensing systems (a sketch of the underlying image-matching step is given below). Finally, the last section of the paper presents some recent findings on drive-by bridge monitoring. In practice, a drive-by monitoring system will likely require GPS to allow the response of a given bridge to be identified; this study looks at the feasibility of using low-cost GPS sensors for this purpose via field trials. The three topics outlined above cover a spectrum of structural health monitoring (SHM) approaches, namely wired monitoring, contactless monitoring and drive-by monitoring.
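
For the computer vision strand, a typical pipeline tracks a high-contrast target on the bridge across video frames and converts the pixel shift into displacement via camera calibration. A minimal sketch of one common ingredient, normalized cross-correlation template matching, run on synthetic frames (the geometry and calibration values are illustrative, not taken from the QUB system):

```python
import numpy as np

def match_template(frame, template):
    """Return (row, col) of the best normalized cross-correlation match."""
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-12)
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            w = frame[r:r + th, c:c + tw]
            score = np.sum(t * (w - w.mean()) / (w.std() + 1e-12))
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

rng = np.random.default_rng(0)
target = rng.random((8, 8))  # high-contrast target pattern on the bridge

def make_frame(target_row):
    """Synthetic video frame with the target at a given vertical position."""
    frame = 0.1 * rng.random((60, 40))
    frame[target_row:target_row + 8, 16:24] += target
    return frame

row_ref, _ = match_template(make_frame(20), target)  # reference frame
row_def, _ = match_template(make_frame(23), target)  # deflected frame

mm_per_pixel = 0.5  # from camera calibration (illustrative)
print(f"Measured deflection: {(row_def - row_ref) * mm_per_pixel:.1f} mm")
```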

Relevance:

30.00%

Abstract:

In Germany the upscaling algorithm is currently the standard approach for evaluating the PV power produced in a region. This method involves spatially interpolating the normalized power of a set of reference PV plants to estimate the power production of another set of unknown plants. As little information on the performance of this method could be found in the literature, the first goal of this thesis is to conduct an analysis of the uncertainty associated with this method. It was found that this method can lead to large errors when the set of reference plants has different characteristics or weather conditions than the set of unknown plants, and when the set of reference plants is small. Based on these preliminary findings, an alternative method is proposed for calculating the aggregate power production of a set of PV plants. A probabilistic approach has been chosen, by which a power production is calculated at each PV plant from corresponding weather data. The probabilistic approach consists of evaluating the power for each frequently occurring value of the parameters and estimating the most probable value by averaging these power values weighted by their frequency of occurrence. The most frequent parameter sets (e.g. module azimuth and tilt angle) and their frequency of occurrence have been assessed on the basis of a statistical analysis of the parameters of approximately 35,000 PV plants. It was found that the plant parameters are statistically dependent on the size and location of the PV plants. Accordingly, separate statistical values have been assessed for 14 classes of nominal capacity and 95 regions in Germany (two-digit zip-code areas). The performance of the upscaling and probabilistic approaches has been compared on the basis of 15 min power measurements from 715 PV plants provided by the German distribution system operator LEW Verteilnetz. It was found that the error of the probabilistic method is smaller than that of the upscaling method when the number of reference plants is sufficiently large (>100 reference plants in the case study considered here). When the number of reference plants is limited (<50 reference plants for the considered case study), the proposed approach provides a noticeable gain in accuracy with respect to the upscaling method.
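
The probabilistic aggregation step reduces to a frequency-weighted average: evaluate the power model at every frequently occurring parameter combination (e.g. module azimuth and tilt) and weight each result by that combination's frequency of occurrence. A minimal sketch with an invented toy power model and made-up frequencies standing in for the statistics derived from the ~35,000-plant sample:

```python
import math

def pv_power(irradiance, tilt_deg, azimuth_deg, p_nominal_kw):
    """Toy PV power model (illustrative only): a simple incidence factor."""
    incidence = max(0.0, math.cos(math.radians(tilt_deg - 35))
                         * math.cos(math.radians(azimuth_deg - 180)))
    return p_nominal_kw * (irradiance / 1000.0) * incidence

# Frequently occurring (tilt, azimuth) combinations and their frequencies --
# invented numbers standing in for the per-class, per-region statistics.
param_freq = {
    (30, 180): 0.40,   # south-facing, 30 degrees
    (45, 180): 0.25,
    (30, 135): 0.20,   # south-east
    (30, 225): 0.15,   # south-west
}

def expected_power(irradiance, p_nominal_kw):
    """Frequency-weighted average power over the parameter distribution."""
    total = sum(param_freq.values())
    return sum(f * pv_power(irradiance, t, a, p_nominal_kw)
               for (t, a), f in param_freq.items()) / total

print(f"Expected power: {expected_power(800.0, 10.0):.2f} kW")
```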

Relevance:

30.00%

Abstract:

Research question: This thesis investigates the determinants of capital structure of Swedish companies. To do so, the two dominant theories of capital structure are studied and their assumptions are tested, in order to establish which of the two theories is more applicable to the Swedish market. Methodology: The study is purely quantitative, conducting an econometric analysis. The data are collected from a secondary source, specifically the "Retriever" database, which contains financial data on Swedish companies. Findings: The findings indicate that the determinants of capital structure in the Swedish market do not differ from those reported in studies conducted in other countries. However, there is a difference when it comes to tax and non-tax shields. The results suggest that in most cases the Pecking Order Theory appears to be more representative of the Swedish market, since most of the coefficients appear to be in favour of it. Moreover, the significance of the effect of industry on financial leverage is confirmed.
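
The econometric core of such a study is a leverage regression: financial leverage regressed on firm-level determinants, with the sign of each coefficient arbitrating between the theories (for instance, a negative profitability coefficient is usually read as favouring the Pecking Order Theory). A minimal sketch on synthetic data, since the Retriever data are proprietary; the variables and coefficients are illustrative only:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500  # synthetic "firms"

size = rng.normal(10, 2, n)              # log total assets
profitability = rng.normal(0.08, 0.05, n)
tangibility = rng.uniform(0.1, 0.8, n)

# Synthetic leverage with a pecking-order-style negative profitability sign.
leverage = (0.02 * size - 0.8 * profitability + 0.3 * tangibility
            + rng.normal(0, 0.05, n))

X = sm.add_constant(np.column_stack([size, profitability, tangibility]))
model = sm.OLS(leverage, X).fit()
print(model.summary(xname=["const", "size", "profitability", "tangibility"]))
```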

Relevance:

30.00%

Abstract:

Background: Gamma-band oscillations are prominently impaired in schizophrenia, but the nature of the deficit and its relationship to perceptual processes is unclear. Methods: 16 patients with chronic schizophrenia (ScZ) and 16 age-matched healthy controls completed a visual paradigm while magnetoencephalographic (MEG) data were recorded. Participants had to detect randomly occurring stimulus acceleration while viewing a concentric moving grating. MEG data were analyzed for spectral power (1-100 Hz) at sensor and source level to examine the brain regions involved in aberrant rhythmic activity, and for the contribution of differences in baseline activity towards the generation of low- and high-frequency power. Results: Our data show reduced gamma-band power at sensor level in schizophrenia patients during stimulus processing, while alpha-band activity and the baseline spectrum were intact. Differences in oscillatory activity correlated with reduced behavioral detection rates in the schizophrenia group and higher scores on the "Cognitive Factor" of the Positive and Negative Syndrome Scale. Source reconstruction revealed that extra-striate (fusiform/lingual gyrus), but not striate (cuneus), visual cortices contributed towards the reduced activity observed at sensor level in ScZ patients. Importantly, differences in stimulus-related activity were not due to differences in baseline activity. Conclusions: Our findings highlight that MEG-measured high-frequency oscillations during visual processing can be robustly identified in ScZ. Our data further suggest impairments that involve dysfunctions in ventral stream processing and a failure to increase gamma-band activity in a task context. Implications of these findings are discussed in the context of current theories of cortical-subcortical circuit dysfunctions and perceptual processing in ScZ.
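
The central spectral step, estimating band-limited power from a sensor time series, can be sketched as follows. This is a generic Welch-based estimate on a synthetic signal, not the actual MEG analysis pipeline used in the study:

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 1000.0                      # sampling rate, Hz (illustrative)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

# Synthetic "sensor": broadband noise plus a 60 Hz gamma-band oscillation.
signal = rng.normal(0.0, 1.0, t.size) + 0.5 * np.sin(2 * np.pi * 60 * t)

freqs, psd = welch(signal, fs=fs, nperseg=1024)

def band_power(lo, hi):
    """Integrate the power spectral density between lo and hi Hz."""
    mask = (freqs >= lo) & (freqs <= hi)
    return trapezoid(psd[mask], freqs[mask])

print(f"alpha power (8-12 Hz):  {band_power(8, 12):.4f}")
print(f"gamma power (30-90 Hz): {band_power(30, 90):.4f}")
```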

Relevance:

30.00%

Abstract:

Understanding the evolution of the direct and indirect pathways of allorecognition following tissue transplantation is essential in the design of tolerance-promoting protocols. On the basis that donor bone marrow-derived antigen-presenting cells are eliminated within days of transplantation, it has been argued that the indirect response represents the major threat to long-term transplant survival, and is consequently the key target for regulation. However, the detection of MHC transfer between cells, and particularly the capture of MHC:peptide complexes by dendritic cells, led us to propose a third, semi-direct, pathway of MHC allorecognition. Persistence of this pathway would lead to sustained activation of direct pathway T cells, arguably persisting for the life of the transplant. In this study, we focused on the contribution of acquired MHC class I, on recipient DCs, during the life span of a skin graft. We observed that MHC class I acquisition by recipient DCs occurs for at least one month following transplantation and may be the main source of alloantigen that drives CD8+ cytotoxic T cell responses. In addition, acquired MHC class I:peptide complexes stimulate T cell responses in vivo, further emphasizing the need to regulate both pathways to induce indefinite survival of the graft.

Relevance:

30.00%

Abstract:

In this thesis, tool support is addressed for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) utilizes abstract behavioral models to automate test generation, thus decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, however, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) is one such tool, which utilizes probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a System Under Test (SUT) and in turn carry out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, the MBPeT tool currently comprises a downloadable command-line based tool as well as a graphical user interface. The goal of this thesis project is two-fold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, thereby removing the requirement of local installation, and 2) to design a user interface for this web application which will add new user interaction paradigms to the existing feature set of the tool. All phases of the MBPeT process will be realized via this single web deployment location, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. This web application (MBPeT Dashboard) is implemented in the Java programming language on top of the Vaadin framework for rich internet application development. The Vaadin framework handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java. A number of experiments are run in a case study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the solution in handling multiple concurrent users. The results support a successful solution with regard to the functional and performance criteria defined, while improvements and optimizations are suggested to increase both of these factors.
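
The probabilistic models at the heart of MBPeT can be pictured as weighted walks over user actions: each virtual user chooses its next action according to transition probabilities, and the resulting action streams form the synthetic workload replayed against the SUT. A minimal sketch of such a generator; the action names and probabilities are invented and do not reflect MBPeT's actual model format:

```python
import random

# Transition probabilities between user actions (invented example model).
model = {
    "browse": [("browse", 0.5), ("search", 0.3), ("buy", 0.1), ("exit", 0.1)],
    "search": [("browse", 0.4), ("buy", 0.4), ("exit", 0.2)],
    "buy":    [("browse", 0.3), ("exit", 0.7)],
}

def generate_session(start="browse", max_steps=50, seed=None):
    """Random walk over the model, yielding one synthetic user session."""
    rng = random.Random(seed)
    actions, state = [], start
    for _ in range(max_steps):
        actions.append(state)
        nxt, weights = zip(*model[state])
        state = rng.choices(nxt, weights=weights)[0]
        if state == "exit":
            break
    return actions

# Generate workload for a handful of concurrent virtual users.
for user in range(3):
    print(f"user {user}: {' -> '.join(generate_session(seed=user))}")
```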

Relevance:

30.00%

Abstract:

We propose a very long baseline atom interferometer test of Einstein's equivalence principle (EEP) with ytterbium and rubidium extending over 10 m of free fall. In view of existing parametrizations of EEP violations, this choice of test masses significantly broadens the scope of atom interferometric EEP tests with respect to other performed or proposed tests by comparing two elements with high atomic numbers. In the first step, our experimental scheme will allow us to reach an accuracy in the Eötvös ratio of 7 × 10^-13. This achievement will constrain violation scenarios beyond our present knowledge and will represent an important milestone for exploring a variety of schemes for further improvements of the tests, as outlined in the paper. We discuss the technical realisation in the new infrastructure of the Hannover Institute of Technology (HITec) and give a short overview of the requirements needed to reach this accuracy. The experiment will demonstrate a variety of techniques which will be employed in future tests of EEP, high-accuracy gravimetry and gravity gradiometry. It includes operation of a force-sensitive atom interferometer with an alkaline-earth-like element in free fall, beam splitting over macroscopic distances and novel source concepts.
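
The quantity being constrained is the Eötvös ratio η = 2(a_Yb - a_Rb)/(a_Yb + a_Rb), with each acceleration read out from the leading-order interferometer phase φ = k_eff * g * T². A minimal sketch with illustrative parameter values (not the experiment's actual numbers):

```python
# Eotvos ratio from two atom-interferometer gravimeters (illustrative numbers).
# Leading-order Mach-Zehnder phase: phi = k_eff * g * T**2.

k_eff_yb = 1.6e7   # effective two-photon wavenumber, rad/m (illustrative)
k_eff_rb = 1.6e7
T = 1.0            # pulse separation time, s (enabled by the long baseline)

g_true = 9.81
phi_yb = k_eff_yb * g_true * T**2
phi_rb = k_eff_rb * g_true * (1 + 5e-13) * T**2  # tiny injected EEP violation

# Invert the phase readout back to accelerations, then form the Eotvos ratio.
a_yb = phi_yb / (k_eff_yb * T**2)
a_rb = phi_rb / (k_eff_rb * T**2)
eta = 2 * (a_yb - a_rb) / (a_yb + a_rb)
print(f"Eotvos ratio eta = {eta:.2e}")  # ~ -5e-13, near the 7e-13 target
```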

Relevance:

30.00%

Abstract:

We report on the suitability of an Einstein-Podolsky-Rosen entanglement source for Gaussian continuous-variable quantum key distribution at 1550 nm. Our source is based on a single continuous-wave squeezed vacuum mode combined with a vacuum mode at a balanced beam splitter. Extending a recent security proof, we characterize the source by quantifying the extractable length of a composable secure key from a finite number of samples under the assumption of collective attacks. We show that distances of the order of 10 km are achievable with this source for a reasonable sample size, despite the fact that the entanglement was generated including a vacuum mode. Our security analysis applies to all states having an asymmetry in the field quadrature variances, including those generated by the superposition of two squeezed modes with different squeezing strengths.
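
The Gaussian description of this source is compact: apply the 50:50 beam-splitter symplectic transformation to the covariance matrix of one squeezed vacuum mode and one vacuum mode. A minimal sketch (vacuum variance normalized to 1; the squeezing value is illustrative):

```python
import numpy as np

r = 1.0  # squeezing parameter (illustrative; ~8.7 dB of squeezing)

# Input covariance matrix, quadrature ordering (x1, p1, x2, p2).
# Mode 1 is the squeezed vacuum, mode 2 the ordinary vacuum (variance 1).
V_in = np.diag([np.exp(-2 * r), np.exp(2 * r), 1.0, 1.0])

# Symplectic matrix of a balanced (50:50) beam splitter.
t = 1.0 / np.sqrt(2.0)
S = np.block([[t * np.eye(2), t * np.eye(2)],
              [-t * np.eye(2), t * np.eye(2)]])

V_out = S @ V_in @ S.T
print("Output covariance matrix:\n", np.round(V_out, 3))
# The diagonal shows the asymmetric x/p quadrature variances the security
# analysis allows for; the off-diagonal blocks carry the EPR correlations.
```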

Relevance:

30.00%

Abstract:

Verifying the resistance to attacks of embedded implementations of Java Card bytecode verifiers is a complex task. Current methods are not sufficiently effective, so only manual test generation is possible. To automate this process, we propose a method called VTG (Vulnerability Test Generation). Based on a formal representation of the functional behaviours of the system under test, a set of intrusion tests is generated. The method draws on mutation testing and model-based testing techniques. First, the model is mutated according to rules that we have defined in order to represent potential attacks. Tests are then extracted from the mutant models. Two Event-B models have been proposed. The first represents the structural constraints of Java Card application files; with it, the VTG generates hundreds of abstract tests in a few seconds. The second model consists of 66 events representing 61 Java Card instructions. Mutation takes a few seconds, and test extraction generates 223 tests in 45 minutes. Each test checks a precondition, or a combination of preconditions, of an instruction. This method allowed us to test various implementation mechanisms of Java Card bytecode verifiers. Although developed for our case study, the proposed method is generic and has been applied to other case studies.
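
The VTG idea, mutating a formal model so that each mutant encodes a potential attack and then deriving a test from each mutant, can be sketched outside Event-B. In the minimal Python sketch below, preconditions are predicates, each mutant drops one of them, and the generated "test" is an input accepted by the mutant but rejected by the original model; the instruction and its fields are invented, not taken from the Java Card specification models:

```python
# Sketch of vulnerability test generation by model mutation: each mutant
# removes one precondition, and the generated test is a state accepted by
# the mutant but rejected by the original model.

# Invented preconditions for a toy stack-push instruction.
preconditions = {
    "type_ok":     lambda s: s["operand_type"] == "short",
    "stack_space": lambda s: s["stack_size"] < s["max_stack"],
}

def accepts(precs, state):
    return all(p(state) for p in precs.values())

def generate_vulnerability_tests(candidate_states):
    """For each mutant (one precondition dropped), find a discriminating state."""
    tests = []
    for dropped in preconditions:
        mutant = {k: v for k, v in preconditions.items() if k != dropped}
        for state in candidate_states:
            if accepts(mutant, state) and not accepts(preconditions, state):
                tests.append((dropped, state))
                break
    return tests

candidates = [
    {"operand_type": "short", "stack_size": 3, "max_stack": 4},  # valid
    {"operand_type": "ref",   "stack_size": 3, "max_stack": 4},  # type confusion
    {"operand_type": "short", "stack_size": 4, "max_stack": 4},  # stack overflow
]

for dropped, state in generate_vulnerability_tests(candidates):
    print(f"attack on '{dropped}': {state}")
```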

Relevance:

30.00%

Abstract:

Due to the growth of design size and complexity, design verification is an important aspect of the logic circuit development process. The purpose of verification is to validate that the design meets the system requirements and specification. This is done by either functional or formal verification. The most popular approach to functional verification is the use of simulation-based techniques; using models to replicate the behaviour of an actual system is called simulation. In this thesis, a software/data-structure architecture without explicit locks is proposed to accelerate logic gate circuit simulation. We call this system ZSIM. The ZSIM software architecture simulator targets low-cost SIMD multi-core machines. Its performance is evaluated on the Intel Xeon Phi and 2 other machines (Intel Xeon and AMD Opteron). The aim of these experiments is to:

• Verify that the data structure used allows SIMD acceleration, particularly on machines with gather instructions (Section 5.3.1).
• Verify that, on sufficiently large circuits, substantial gains could be made from multicore parallelism (Section 5.3.2).
• Show that a simulator using this approach out-performs an existing commercial simulator on a standard workstation (Section 5.3.3).
• Show that the performance on a cheap Xeon Phi card is competitive with results reported elsewhere on much more expensive supercomputers (Section 5.3.5).

To evaluate ZSIM, two types of test circuits were used:

1. Circuits from the IWLS benchmark suite [1], which allow direct comparison with other published studies of parallel simulators.
2. Circuits generated by a parametrised circuit synthesizer. The synthesizer used an algorithm that has been shown to generate circuits that are statistically representative of real logic circuits, and it allowed testing of a range of very large circuits, larger than the ones for which it was possible to obtain open source files.

The experimental results show that with SIMD acceleration and multicore parallelism, ZSIM gained a peak parallelisation factor of 300 on the Intel Xeon Phi and 11 on the Intel Xeon. With only SIMD enabled, ZSIM achieved a maximum parallelisation gain of 10 on the Intel Xeon Phi and 4 on the Intel Xeon. Furthermore, it was shown that this software architecture simulator running on a SIMD machine is much faster than, and can handle much bigger circuits than, a widely used commercial simulator (Xilinx) running on a workstation. The performance achieved by ZSIM was also compared with similar pre-existing work on logic simulation targeting GPUs and supercomputers. It was shown that the ZSIM simulator running on a Xeon Phi machine gives comparable simulation performance to the IBM Blue Gene supercomputer at very much lower cost. The experimental results have shown that the Xeon Phi is competitive with simulation on GPUs and allows the handling of much larger circuits than have been reported for GPU simulation. When targeting the Xeon Phi architecture, the Xeon Phi's automatic cache management handles the on-chip local store without any explicit reference to the local store in the architecture of the simulator itself; when targeting GPUs, by contrast, explicit cache management in the program increases the complexity of the software architecture. Furthermore, one of the strongest points of the ZSIM simulator is its portability: the same code was tested on both AMD and Xeon Phi machines, and the same architecture that performs efficiently on the Xeon Phi was ported to a 64-core NUMA AMD Opteron. To conclude, the two main achievements are restated as follows: the primary achievement of this work was proving that the ZSIM architecture is faster than previously published logic simulators on low-cost platforms; the secondary achievement was the development of a synthetic testing suite that went beyond the scale range that was previously publicly available, based on prior work showing that the synthesis technique is valid.
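
The data-parallel style of gate evaluation that lets a simulator exploit SIMD can be sketched with numpy: store the netlist as flat arrays (structure-of-arrays, no per-gate objects or locks) and evaluate all gates of one topological level at once with vectorized bitwise operations, packing 64 test patterns into each machine word. This illustrates the general technique only and is not ZSIM's actual data structure:

```python
import numpy as np

# Structure-of-arrays netlist: one row per gate, (opcode, fanin0, fanin1, out).
# Gates are grouped by topological level so a whole level evaluates at once.
AND, OR, XOR, NOT = 0, 1, 2, 3

values = np.zeros(7, dtype=np.uint64)      # signal values, 64 patterns/word
values[0] = 0xAAAAAAAAAAAAAAAA             # primary input a
values[1] = 0xCCCCCCCCCCCCCCCC             # primary input b
values[2] = 0xF0F0F0F0F0F0F0F0             # primary input c

levels = [
    np.array([(AND, 0, 1, 3), (XOR, 1, 2, 4)], dtype=np.int64),
    np.array([(OR, 3, 4, 5), (NOT, 5, 5, 6)], dtype=np.int64),
]

def eval_level(level, values):
    """Vectorized evaluation of one topological level of gates."""
    op, a, b, out = level[:, 0], level[:, 1], level[:, 2], level[:, 3]
    va, vb = values[a], values[b]
    res = np.empty_like(va)
    res[op == AND] = (va & vb)[op == AND]
    res[op == OR] = (va | vb)[op == OR]
    res[op == XOR] = (va ^ vb)[op == XOR]
    res[op == NOT] = (~va)[op == NOT]
    values[out] = res

for level in levels:
    eval_level(level, values)

# gate 6 = NOT(OR(AND(a, b), XOR(b, c))), one bit per test pattern
print(f"gate 6 outputs: {int(values[6]):016x}")
```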