Abstract:
Background: Lung clearance index (LCI) derived from sulfur hexafluoride (SF6) multiple breath washout (MBW) is a sensitive measure of lung disease in people with cystic fibrosis (CF). However, it can be time-consuming, limiting its clinical use. Aim: To compare the repeatability, sensitivity and test duration of LCI derived from washout to 1/30th (LCI1/30), 1/20th (LCI1/20) and 1/10th (LCI1/10) of initial concentration with ‘standard’ LCI derived from washout to 1/40th initial concentration (LCI1/40). Methods: Triplicate MBW test results from 30 clinically stable people with CF and 30 healthy controls were analysed retrospectively. MBW tests were performed using 0.2% SF6 and a modified Innocor device. All LCI end points were calculated using SimpleWashout software. Repeatability was assessed using coefficient of variation (CV%). The proportion of people with CF with and without abnormal LCI and forced expiratory volume in 1 s (FEV1) % predicted was compared. Receiver operating characteristic (ROC) curve statistics were calculated. Test duration of all LCI end points was compared using paired t tests. Results: In people with CF, LCI1/40 CV% (p=0.16), LCI1/30 CV% (p=0.53), LCI1/20 CV% (p=0.14) and LCI1/10 CV% (p=0.25) were not significantly different from those of controls. The sensitivity of LCI1/40, LCI1/30 and LCI1/20 to the presence of CF was equal (67%). The sensitivity of LCI1/10 and FEV1% predicted was lower (53% and 47% respectively). Area under the ROC curve (95% CI) for LCI1/40, LCI1/30, LCI1/20, LCI1/10 and FEV1% predicted was 0.89 (0.80 to 0.97), 0.87 (0.77 to 0.96), 0.87 (0.78 to 0.96), 0.83 (0.72 to 0.94) and 0.73 (0.60 to 0.86), respectively. Test duration of LCI1/30, LCI1/20 and LCI1/10 was significantly shorter than that of LCI1/40 in people with CF (p<0.0001), equating to a 5%, 9% and 15% time saving, respectively. Conclusions: In this study, LCI1/20 was a repeatable and sensitive measure with diagnostic performance equal to that of LCI1/40. Its test duration was also shorter, potentially offering a more feasible research and clinical measure.
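The two statistics at the heart of this abstract, within-subject coefficient of variation (CV%) and area under the ROC curve, can be sketched in a few lines. The triplicate values and group LCI values below are hypothetical illustrations, not the study's data.

```python
from statistics import mean, stdev

def cv_percent(triplicate):
    """Within-subject coefficient of variation: 100 * SD / mean."""
    return 100.0 * stdev(triplicate) / mean(triplicate)

def auc(cases, controls):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen case scores higher than a
    randomly chosen control (ties count as 0.5)."""
    wins = sum(1.0 if c > h else 0.5 if c == h else 0.0
               for c in cases for h in controls)
    return wins / (len(cases) * len(controls))

# Hypothetical LCI values (dimensionless lung turnovers), NOT study data
cf_lci = [9.8, 11.2, 12.5, 8.9, 10.4]        # people with CF
control_lci = [6.3, 6.8, 7.1, 6.5, 7.4]      # healthy controls

print(round(cv_percent([9.7, 10.1, 10.4]), 1))  # CV% of one triplicate
print(auc(cf_lci, control_lci))                 # 1.0 = perfect separation
```

An AUC near 0.89, as reported for LCI1/40, means a randomly selected person with CF would have a higher LCI than a randomly selected control about 89% of the time.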
Abstract:
Current variation aware design methodologies, tuned for worst-case scenarios, are becoming increasingly pessimistic from the perspective of power and performance. A good example of such pessimism is setting the refresh rate of DRAMs according to the worst-case access statistics, thereby resulting in very frequent refresh cycles, which are responsible for the majority of the standby power consumption of these memories. However, such a high refresh rate may not be required, either due to the extremely low probability of the actual occurrence of such a worst case, or due to the inherent error resilient nature of many applications that can tolerate a certain number of potential failures. In this paper, we exploit and quantify the possibilities that exist in dynamic memory design by shifting to the so-called approximate computing paradigm in order to save power and enhance yield at no cost. The statistical characteristics of the retention time in dynamic memories were revealed by studying a fabricated 2 kb CMOS compatible embedded DRAM (eDRAM) memory array based on gain-cells. Measurements show that up to 73% of the retention power can be saved by altering the refresh time and setting it such that a small number of failures is allowed. We show that these savings can be further increased by utilizing known circuit techniques, such as body biasing, which can help not only in extending, but also in preferably shaping, the retention time distribution. Our approach is one of the first attempts to assess the data integrity and energy tradeoffs achieved in eDRAMs for utilizing them in error resilient applications and can prove helpful in the anticipated shift to approximate computing.
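The core tradeoff described above, lengthening the refresh period until a small, tolerable number of cells fails to retain data, can be sketched as follows. The lognormal retention-time distribution and the power model are illustrative assumptions, not the paper's measured silicon data.

```python
import math
import random

random.seed(1)

# Hypothetical retention times (ms) for a 2 kb gain-cell eDRAM array,
# drawn from a lognormal distribution -- an assumption for illustration,
# NOT the fabricated array's measurements.
retention_ms = sorted(random.lognormvariate(math.log(40.0), 0.5)
                      for _ in range(2048))

def max_refresh_period(retentions, allowed_failures):
    """Longest refresh period such that at most `allowed_failures`
    cells have a retention time shorter than the period
    (retentions must be sorted ascending)."""
    return retentions[allowed_failures]

t_worst = retention_ms[0]                         # zero-failure worst case
t_relaxed = max_refresh_period(retention_ms, 20)  # tolerate 20 weak cells

# Refresh power scales roughly with refresh frequency, i.e. 1 / period.
saving = 1.0 - t_worst / t_relaxed
print(f"refresh period {t_worst:.1f} -> {t_relaxed:.1f} ms, "
      f"~{100 * saving:.0f}% refresh power saved")
```

Because the weak tail of the distribution is sparse, tolerating even a handful of failing cells lets the refresh period grow substantially, which is the mechanism behind the reported 73% retention-power saving.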
Abstract:
Static timing analysis provides the basis for setting the clock period of a microprocessor core, based on its worst-case critical path. However, depending on the design, this critical path is not always excited and therefore dynamic timing margins exist that can theoretically be exploited for the benefit of better speed or lower power consumption (through voltage scaling). This paper introduces predictive instruction-based dynamic clock adjustment as a technique to trim dynamic timing margins in pipelined microprocessors. To this end, we exploit the different timing requirements for individual instructions during the dynamically varying program execution flow without the need for complex circuit-level measures to detect and correct timing violations. We provide a design flow to extract the dynamic timing information for the design using post-layout dynamic timing analysis and we integrate the results into a custom cycle-accurate simulator. This simulator allows annotation of individual instructions with their impact on timing (in each pipeline stage) and rapidly derives the overall code execution time for complex benchmarks. The design methodology is illustrated at the microarchitecture level, demonstrating the performance and power gains possible on a 6-stage OpenRISC in-order general purpose processor core in a 28nm CMOS technology. We show that employing instruction-dependent dynamic clock adjustment leads on average to an increase in operating speed by 38% or to a reduction in power consumption by 24%, compared to traditional synchronous clocking, which at all times has to respect the worst-case timing identified through static timing analysis.
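The headline mechanism, clocking each instruction at its own post-layout critical-path delay instead of the global worst case, can be illustrated with a toy cycle-time model. The per-instruction delays and the instruction mix below are invented for illustration; they are not the OpenRISC characterization data.

```python
# Hypothetical per-instruction critical-path delays (ns), standing in for
# post-layout dynamic timing analysis results -- invented numbers.
delay_ns = {"add": 0.62, "load": 0.95, "store": 0.90,
            "mul": 1.00, "branch": 0.70}

def exec_time(trace, clock_ns=None):
    """Total execution time of an instruction trace.
    clock_ns=None  -> dynamic clock: each cycle takes that
                      instruction's own delay.
    clock_ns=value -> fixed period every cycle (static worst case)."""
    if clock_ns is None:
        return sum(delay_ns[op] for op in trace)
    return clock_ns * len(trace)

# Invented instruction mix for a small benchmark
trace = ["add"] * 50 + ["load"] * 20 + ["branch"] * 20 + ["mul"] * 10

worst = max(delay_ns.values())        # bound from static timing analysis
t_static = exec_time(trace, worst)
t_dynamic = exec_time(trace)
print(f"speedup from dynamic clock adjustment: {t_static / t_dynamic:.2f}x")
```

The same slack can instead be traded for lower supply voltage at a fixed clock, which is how the abstract's alternative 24% power reduction arises.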
Abstract:
The worsening of process variations and the consequent increased spreads in circuit performance and power consumption hinder the satisfaction of the targeted budgets and lead to yield loss. Corner-based design and the adoption of design guardbands might limit the yield loss. However, in many cases such methods may not be able to capture the real effects, which might be considerably better than the predicted ones, leading to increasingly pessimistic designs. The situation is even more severe in memories, which consist of substantially different individual building blocks, further complicating the accurate analysis of the impact of variations at the architecture level and leaving many potential issues uncovered and opportunities unexploited. In this paper, we develop a framework for capturing non-trivial statistical interactions among all the components of a memory/cache. The developed tool is able to find the optimum memory/cache configuration under various constraints, allowing designers to make the right choices early in the design cycle and consequently improve performance, energy, and especially yield. Our results indicate that considering the architectural interactions between the memory components allows relaxation of the pessimistic access times predicted by existing techniques.
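A minimal sketch of the constraint-driven search such a tool performs: enumerate candidate memory configurations and keep the lowest-energy one whose access-time and yield estimates meet the budgets. The configurations and all numbers below are hypothetical; the paper's statistical interaction models are far richer than this enumeration.

```python
# Hypothetical (name, access_time_ns, energy_pJ, yield_fraction) tuples
# standing in for statistically characterized cache configurations.
configs = [
    ("16kB-2way", 1.10, 9.5, 0.995),
    ("32kB-2way", 1.25, 12.0, 0.996),
    ("32kB-4way", 1.40, 14.5, 0.999),
    ("64kB-4way", 1.60, 20.0, 0.990),
]

def best_config(max_access_ns, min_yield):
    """Lowest-energy configuration meeting both the timing budget
    and the yield constraint; None if no configuration qualifies."""
    feasible = [c for c in configs
                if c[1] <= max_access_ns and c[3] >= min_yield]
    return min(feasible, key=lambda c: c[2], default=None)

print(best_config(max_access_ns=1.5, min_yield=0.998))
# A tighter timing budget can leave no feasible choice:
print(best_config(max_access_ns=1.2, min_yield=0.998))
```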
Abstract:
Background: A novel lateral flow, immunochromatographic assay (LFD) specific for Mycobacterium bovis, the cause of bovine tuberculosis and zoonotic TB, was recently developed at Queen’s University Belfast. The LFD detects whole M. bovis cells, in contrast to other commercially available LFD tests (BD MGIT™ TBc ID, SD Bioline TB Ag MPT 64, Capilia TB-Neo kit) which detect MPT64 antigen secreted during growth. The new LFD test has been evaluated in the veterinary context, and its specificity for M. bovis in the broadest sense (i.e. subsp. bovis, subsp. caprae and BCG) and sensitivity to detect M. bovis in positive MGIT™ liquid cultures was demonstrated comprehensively.
Methods: Preliminary work was carried out by researchers at Queen’s University Belfast to optimise sputum sample preparation, estimate the limit of detection (LOD) of the LFD with M. bovis-spiked sputum samples, and check LFD specificity by testing a broad range of non-tuberculous Mycobacterium spp. (NTM) and other bacterial genera commonly encountered in sputum samples (Haemophilus, Klebsiella, Pseudomonas, Staphylococcus). In the Cameroon laboratory direct detection of M. bovis in human sputa was attempted, and 50 positive sputum MGIT™ cultures and 33 cultures of various Mycobacterium spp. originally isolated from human sputa were tested.
Results: Sputum sample preparation consisted of digestion with 1% NALC for 30 min, centrifugation at 3000 g for 20 min, a PBS wash, centrifugation again, and resuspension of the pellet in KPL blocking buffer before 100 µl was applied to the LFD. The LOD of the LFD applied to M. bovis-spiked sputum was estimated to be 10⁴ CFU/ml. A small number of confirmed Ziehl-Neelsen ‘3+’ M. bovis positive sputum samples were tested directly but no positive LFD results were obtained. All of the sputum MGIT™ cultures and mycobacterial cultures (including M. tuberculosis, M. africanum, M. bovis, M. intracellulare, M. scrofulaceum, M. fortuitum, M. peregrinum, M. interjectum) tested LFD negative when read after 15 min except for the M. bovis cultures, thereby confirming specificity of the LFD for M. bovis in the clinical microbiology context.
Conclusions: Results indicate that the ‘Rapid-bTB’ LFD is a very specific test, able to differentiate M. bovis from M. tuberculosis, M. africanum, and a range of NTM isolated from human sputa in MGIT™ liquid cultures. However, the LFD lacks sufficient sensitivity to be applied earlier in the diagnostic process to directly test human sputa.
Abstract:
With the development and deployment of IEC 61850 based smart substations, cybersecurity vulnerabilities of supervisory control and data acquisition (SCADA) systems are increasingly emerging. In response, a test-bed is indispensable to enable cybersecurity experimentation. In this paper, a comprehensive and realistic cyber-physical test-bed has been built to investigate potential cybersecurity vulnerabilities and the impact of cyber-attacks on IEC 61850 based smart substations. This test-bed closely resembles a real production-type environment and has the ability to carry out end-to-end testing of cyber-attacks and their physical consequences. A fuzz testing approach for IEC 61850 based intelligent electronic devices (IEDs) is proposed and validated in the test-bed.
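A mutation-based fuzzing loop of the kind such a test-bed could drive against an IED is sketched below. The seed bytes are a hypothetical stand-in for a captured frame (not a real IEC 61850 trace), and the stub parser merely imitates a crash oracle; a real campaign would send each case to the device and watch for protocol errors or resets.

```python
import random

random.seed(7)

# Hypothetical seed message standing in for a captured substation frame;
# byte 2 plays the role of a declared-length field.
SEED = bytes([0x61, 0x81, 0x08, 0x00, 0x01, 0xA0, 0x0B, 0x02])

def mutate(packet, n_flips=2):
    """Randomly flip a few bits of the seed packet."""
    data = bytearray(packet)
    for _ in range(n_flips):
        i = random.randrange(len(data))
        data[i] ^= 1 << random.randrange(8)
    return bytes(data)

def fragile_parser(packet):
    """Stub for the device under test: rejects frames whose declared
    length exceeds the actual frame size."""
    if packet[2] > len(packet):
        raise ValueError("length field overflow")

def fuzz(rounds=200):
    """Run mutated test cases and collect the inputs that fail."""
    failures = []
    for _ in range(rounds):
        case = mutate(SEED)
        try:
            fragile_parser(case)
        except ValueError:
            failures.append(case)
    return failures

crashes = fuzz()
print(f"{len(crashes)} failing inputs out of 200")
```

Each failing input would then be replayed against the IED in the test-bed to confirm the vulnerability end-to-end, including its physical consequences.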
Abstract:
Reasoning that is deliberative and reflective often requires the inhibition of intuitive responses. The Cognitive Reflection Test (CRT) is designed to assess people’s ability to suppress incorrect heuristic responses in favour of deliberation. Correct responding on the CRT predicts performance on a range of tasks in which intuitive processes lead to incorrect responses, suggesting indirectly that CRT performance is related to cognitive control. Yet little is known about the cognitive processes underlying performance on the CRT. In the current research, we employed a novel mouse tracking methodology to capture the time-course of reasoning on the CRT. Analysis of mouse cursor trajectories revealed that participants were initially drawn towards the incorrect (i.e., intuitive) option even when the correct (deliberative) option was ultimately chosen. Conversely, participants were not attracted to the correct option when they ultimately chose the incorrect intuitive one. We conclude that intuitive processes are activated automatically on the CRT and must be inhibited in order to respond correctly. When participants responded intuitively, there was no evidence that deliberative reasoning had become engaged.
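The attraction effect in mouse tracking studies is commonly quantified as the maximum deviation of the cursor path from the straight line joining its start and end points; a sketch follows. The sample trajectory is invented, and this is a generic trajectory metric rather than the paper's specific analysis pipeline.

```python
def max_deviation(path):
    """Maximum perpendicular distance of a cursor trajectory from the
    straight line between its first and last points. On correct trials,
    larger values indicate attraction toward the competing response."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = (dx * dx + dy * dy) ** 0.5
    return max(abs(dy * (x - x0) - dx * (y - y0)) / norm
               for x, y in path)

# Invented trajectory (pixels): starts at the bottom centre, bows toward
# the intuitive option on the left, then ends on the correct option
# at the top right.
trial = [(0, 0), (-40, 120), (-55, 240), (-20, 360), (180, 480), (300, 520)]
print(round(max_deviation(trial), 1))
```

Comparing this deviation between trials ending on the correct versus the intuitive option is what reveals the asymmetric pull the abstract describes.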
Abstract:
This paper presents initial results of evaluating the suitability of the conventional two-tone CW passive intermodulation (PIM) test for characterization of modulated signal distortion by passive nonlinearities in base station antennas and the RF front-end. A comprehensive analysis of analog and digitally modulated waveforms in transmission lines with weak distributed nonlinearity has been performed using harmonic balance analysis and X-parameters in the Advanced Design System (ADS) simulator. The nonlinear distortion metrics used in the conventional two-tone CW PIM test have been compared with the respective spectral metrics applied to the modulated waveforms, such as adjacent channel power ratio (ACPR) and error vector magnitude (EVM). It is shown that the results of two-tone CW PIM tests are consistent with the metrics used for assessment of signal integrity of both analog and digitally modulated waveforms.
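Of the spectral metrics compared above, ACPR is the most direct to compute: the integrated power that nonlinear distortion spills into the adjacent channel, relative to the power in the main channel. The sketch below uses a synthetic spectrum with invented bin values, not the ADS simulation results.

```python
import math

def acpr_db(psd, main, adjacent):
    """Adjacent channel power ratio in dB: integrated power in the
    adjacent-channel bins relative to the main-channel bins."""
    p_main = sum(psd[i] for i in main)
    p_adj = sum(psd[i] for i in adjacent)
    return 10.0 * math.log10(p_adj / p_main)

# Synthetic one-sided power spectrum (linear units per bin): a strong
# main channel in bins 4-7 and weak spectral regrowth in bins 8-11.
psd = [0.001] * 4 + [1.0, 1.2, 1.1, 0.9] + [0.002, 0.003, 0.002, 0.001]

print(round(acpr_db(psd, main=range(4, 8), adjacent=range(8, 12)), 1))
```

A more negative ACPR means less distortion power leaking into the neighbouring channel, which is why it tracks the two-tone PIM level as the nonlinearity strength varies.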