985 results for Noise detection
Abstract:
PURPOSE: To determine the lower limit of dose reduction with hybrid and fully iterative reconstruction algorithms in detection of endoleaks and in-stent thrombus of the thoracic aorta with computed tomographic (CT) angiography by applying protocols with different tube energies and automated tube current modulation. MATERIALS AND METHODS: The calcification insert of an anthropomorphic cardiac phantom was replaced with an aortic aneurysm model containing a stent, simulated endoleaks, and an intraluminal thrombus. CT was performed at tube energies of 120, 100, and 80 kVp with incrementally increasing noise indexes (NIs) of 16, 25, 34, 43, 52, 61, and 70 and a 2.5-mm section thickness. NI directly controls radiation exposure; a higher NI allows for greater image noise and decreases radiation. Images were reconstructed with filtered back projection (FBP) and hybrid and fully iterative algorithms. Five radiologists independently analyzed lesion conspicuity to assess sensitivity and specificity. Mean attenuation (in Hounsfield units) and standard deviation were measured in the aorta to calculate signal-to-noise ratio (SNR). Attenuation and SNR of different protocols and algorithms were analyzed with analysis of variance or the Welch test, depending on data distribution. RESULTS: Both sensitivity and specificity were 100% for simulated lesions on images with 2.5-mm section thickness and an NI of 25 (3.45 mGy), 34 (1.83 mGy), or 43 (1.16 mGy) at 120 kVp; an NI of 34 (1.98 mGy), 43 (1.23 mGy), or 61 (0.61 mGy) at 100 kVp; and an NI of 43 (1.46 mGy) or 70 (0.54 mGy) at 80 kVp. SNR values showed similar results. With the fully iterative algorithm, mean attenuation of the aorta decreased significantly in reduced-dose protocols in comparison with control protocols at 100 kVp (311 HU at 16 NI vs 290 HU at 70 NI, P ≤ .0011) and 80 kVp (400 HU at 16 NI vs 369 HU at 70 NI, P ≤ .0007). CONCLUSION: Endoleaks and in-stent thrombus of the thoracic aorta were detectable down to 1.46 mGy (80 kVp) with FBP, 1.23 mGy (100 kVp) with the hybrid algorithm, and 0.54 mGy (80 kVp) with the fully iterative algorithm.
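The SNR figure used in this abstract is the usual region-of-interest measure: mean attenuation divided by its standard deviation (the image-noise estimate). A minimal sketch in Python, assuming only that the region of interest is sampled from the aortic lumen; the phantom setup and ROI placement are the authors', and the example values are illustrative:

```python
import numpy as np

def aortic_snr(roi_hu):
    """SNR of a CT region of interest: mean attenuation (HU)
    divided by its standard deviation, as described above."""
    roi_hu = np.asarray(roi_hu, dtype=float)
    return roi_hu.mean() / roi_hu.std(ddof=1)

# Illustrative only: ~311 HU mean with ~30 HU of image noise.
rng = np.random.default_rng(0)
print(f"SNR = {aortic_snr(rng.normal(311.0, 30.0, 500)):.1f}")
```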
Abstract:
Vibration-based damage identification (VBDI) techniques have been developed in part to address the problems associated with an aging civil infrastructure. To assess the potential of VBDI as it applies to highway bridges in Iowa, three applications of VBDI techniques were considered in this study: numerical simulation, laboratory structures, and field structures. VBDI techniques were found to be highly capable of locating and quantifying damage in numerical simulations. These same techniques were found to be accurate in locating various types of damage in a laboratory setting with actual structures. Although there is the potential for these techniques to quantify damage in a laboratory setting, the ability of the methods to quantify low-level damage in the laboratory is not robust. When applying these techniques to an actual bridge, it was found that some traditional applications of VBDI methods are capable of describing the global behavior of the structure but are most likely not suited for the identification of typical damage scenarios found in civil infrastructure. Measurement noise, boundary conditions, complications due to substructures and multiple material types, and transducer sensitivity make it very difficult for present VBDI techniques to identify, much less quantify, highly localized damage (such as small cracks and minor changes in thickness). However, while investigating VBDI techniques in the field, it was found that if the frequency-domain response of the structure can be generated from operating traffic load, the structural response can be animated and used to develop a holistic view of the bridge’s response to various automobile loadings. By animating the response of a field bridge, concrete cracking (in the abutment and deck) was correlated with structural motion and problem frequencies (i.e., those that cause significant torsion or tension-compression at beam ends) were identified. Furthermore, a frequency-domain study of operational traffic was used to identify both common and extreme frequencies for a given structure and loading. Common traffic frequencies can be compared to problem frequencies so that cost-effective, preventative solutions (either structural or usage-based) can be developed for a wide range of IDOT bridges. Further work should (1) perfect the process of collecting high-quality operational frequency response data; (2) expand and simplify the process of correlating frequency response animations with damage; and (3) develop efficient, economical, preemptive solutions to common damage types.
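One common way to obtain the frequency-domain response from operating traffic, as described above, is to estimate the power spectral density of accelerometer records with Welch's method; spectral peaks then mark candidate problem frequencies. A sketch under that assumption; the sampling rate and the single-channel toy record are illustrative, not from the study:

```python
import numpy as np
from scipy.signal import welch

fs = 256.0                                  # assumed sampling rate, Hz
t = np.arange(0, 60.0, 1.0 / fs)
# Toy stand-in for a girder accelerometer record under traffic load:
accel = np.sin(2 * np.pi * 3.2 * t) + 0.5 * np.random.randn(t.size)

# Welch's method estimates the operational PSD; its peaks approximate
# the structure's dominant response frequencies under traffic.
freqs, psd = welch(accel, fs=fs, nperseg=4096)
print(f"Dominant response frequency ~ {freqs[np.argmax(psd)]:.2f} Hz")
```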
Abstract:
We present a method to detect patterns in defocused scenes by means of a joint transform correlator. We describe analytically the correlation plane, and we also introduce an original procedure to recognize the target by postprocessing the correlation plane. The performance of the methodology when the defocused images are corrupted by additive noise is also considered.
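For readers unfamiliar with the device, a joint transform correlator can be simulated digitally: the reference and the scene are placed side by side in the input plane, the squared magnitude of their Fourier transform (the joint power spectrum) is transformed again, and the off-axis terms of the result are the cross-correlation peaks used for detection. A minimal sketch; the paper's defocus model and correlation-plane post-processing are not reproduced:

```python
import numpy as np

def jtc_correlation_plane(reference, scene):
    """Digitally simulated joint transform correlator.

    reference and scene are equally sized 2-D arrays placed side by
    side in the input plane; the returned array is the correlation
    plane, whose off-axis peaks indicate detections of the target."""
    joint = np.concatenate([reference, scene], axis=1)
    jps = np.abs(np.fft.fft2(joint)) ** 2     # joint power spectrum
    return np.fft.fftshift(np.abs(np.fft.fft2(jps)))
```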
Abstract:
Coalescing compact binary systems are important sources of gravitational waves. Here we investigate the detectability of this gravitational radiation by the recently proposed laser interferometers. The spectral density of noise for various practicable configurations of the detector is also reviewed. This includes laser interferometers with delay lines and Fabry-Pérot cavities in the arms, both in standard and dual recycling arrangements. The sensitivity of the detector in all those configurations is presented graphically and the signal-to-noise ratio is calculated numerically. For all configurations we find values of the detector's parameters which maximize the detectability of coalescing binaries, the discussion comprising Newtonian- as well as post-Newtonian-order effects. Contour plots of the signal-to-noise ratio are also presented in certain parameter domains which illustrate the interferometer's response to coalescing binary signals.
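The signal-to-noise ratio computed numerically in studies of this kind is normally the standard matched-filter value: for a signal with Fourier transform h̃(f) observed in noise of one-sided spectral density S_n(f),

```latex
\left(\frac{S}{N}\right)^{2} = 4\int_{0}^{\infty}\frac{|\tilde{h}(f)|^{2}}{S_{n}(f)}\,\mathrm{d}f ,
```

so choosing detector parameters that maximize detectability amounts to making S_n(f) smallest over the band where the coalescing-binary spectrum |h̃(f)|² is concentrated.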
Abstract:
We analyze the consequences that the choice of the output of the system has on the efficiency of signal detection. It is shown that the output signal and the signal-to-noise ratio (SNR), used to characterize the phenomenon of stochastic resonance, strongly depend on the form of the output. In particular, the SNR may be enhanced for an adequate choice of output.
Abstract:
In this paper we propose an endpoint detection system based on several features extracted from each speech frame, followed by a robust classifier (i.e., AdaBoost and bagging of decision trees, and a multilayer perceptron) and a finite-state automaton (FSA). The FSA module consisted of a 4-state decision logic that filtered false alarms and false positives. We compare the use of four different classifiers in this task. The look-ahead of the proposed method was 7 frames, the number of frames that maximized the accuracy of the system. The system was tested with real signals recorded inside a car, with signal-to-noise ratios ranging from 6 dB to 30 dB. Finally, we present experimental results demonstrating that the system yields robust endpoint detection.
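The 4-state decision logic is not specified in the abstract; a plausible minimal version confirms speech onsets and offsets only after several consecutive agreeing frame-level classifier decisions, so that isolated errors are filtered out. A sketch in Python, with the state names and transition rules as illustrative assumptions; only the 7-frame look-ahead is taken from the text:

```python
SILENCE, MAYBE_SPEECH, SPEECH, MAYBE_SILENCE = range(4)

def endpoint_fsa(frame_labels, confirm=7):
    """Smooth per-frame classifier outputs (1 = speech, 0 = non-speech).

    A transition is confirmed only after `confirm` consecutive agreeing
    frames (7, matching the look-ahead reported above), which suppresses
    isolated misclassifications."""
    state, run = SILENCE, 0
    for label in frame_labels:
        if state == SILENCE:
            if label:
                state, run = MAYBE_SPEECH, 1
        elif state == MAYBE_SPEECH:
            run = run + 1 if label else 0
            if run == 0:
                state = SILENCE          # spurious onset rejected
            elif run >= confirm:
                state = SPEECH           # onset confirmed
        elif state == SPEECH:
            if not label:
                state, run = MAYBE_SILENCE, 1
        else:                            # MAYBE_SILENCE
            run = run + 1 if not label else 0
            if run == 0:
                state = SPEECH           # spurious offset rejected
            elif run >= confirm:
                state = SILENCE          # offset confirmed
        yield int(state in (SPEECH, MAYBE_SILENCE))
```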
Abstract:
Iowa has approximately 1000 bridges that have been overlaid with a nominal 2" of portland cement concrete. A Delamtect survey of a sampling of the older overlaid bridges indicated delaminations in several of them. Eventually these bridges, as well as those that have not received an overlay, must be programmed for rehabilitation. Prior to rehabilitation, the delaminated areas must be identified. There are currently two standard methods of determining delaminated areas in bridge decks: sounding with a metal object or a chain drag, and sounding with an electro-mechanical sounding system (Delamtect). Sounding with a metal object or chain drag is time-consuming, and its accuracy depends on the ear of the operator and may be affected by traffic noise. The Delamtect requires less field time, but the graphical traces require that data reduction be done in the office. A recently developed method of detecting delamination is infrared thermography. This method is based on the temperature difference between sound and delaminated concrete. A contract was negotiated with Donohue and Associates, Inc. of Sheboygan, Wisconsin, to survey 18 portland cement concrete overlaid bridge decks in Iowa using the infrared thermography method of detecting delaminations.
Abstract:
Reaching a consensus in terms of interchangeability and utility (i.e., disease detection/monitoring) of a medical device is the eventual aim of repeatability and agreement studies. The aim of the tolerance and relative utility indices described in this report is to provide a methodology to compare change in clinical measurement noise between different populations (repeatability) or measurement methods (agreement), so as to highlight problematic areas. No longitudinal data are required to calculate these indices. Both indices establish a metric of least to most affected across all parameters to facilitate comparison. If validated, these indices may prove useful tools when combining reports and forming the consensus required in the validation process for software updates and new medical devices.
Abstract:
OBJECTIVES: To determine inter-session and intra/inter-individual variations of the attenuations of aortic blood/myocardium with MDCT in the context of calcium scoring, and to evaluate whether these variations depend on patients' characteristics. METHODS: Fifty-four volunteers were evaluated with calcium scoring non-enhanced CT. We measured attenuations (inter-individual variation) and standard deviations (SD, intra-individual variation) of the blood in the ascending aorta and of the myocardium of the left ventricle. Every volunteer was examined twice to study the inter-session variation. The fat pad thickness at the sternum and the noise (SD of air) were also measured. These values were correlated with the measured aortic/ventricular attenuations and their SDs (Pearson). Historically fixed thresholds (90 and 130 HU) were tested against different models based on the attenuations of blood/ventricle. RESULTS: The mean attenuation was 46 HU (range, 17-84 HU) with a mean SD of 23 HU for the blood, and 39 HU (10-82 HU) with a mean SD of 18 HU for the myocardium. The attenuation/SD of the blood were significantly higher than those of the myocardium (p < 0.01). The inter-session variation was not significant. There was a poor correlation between the SDs of aortic blood/ventricle and fat thickness/noise. Based on the existing models, the 90 HU threshold offers a confidence interval of approximately 95% and 130 HU of more than 99%. CONCLUSIONS: The historical thresholds offer high confidence intervals for the exclusion of aortic blood/myocardium and thereby for detecting calcifications. Nevertheless, considering the large variations of blood/myocardium CT values and the influence of patients' characteristics, a better approach might be an adaptive threshold.
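As a hedged reading of the reported numbers: under a roughly Gaussian model of the blood attenuation (mean 46 HU, SD 23 HU), the two fixed thresholds sit at

```latex
z_{90} = \frac{90 - 46}{23} \approx 1.9, \qquad z_{130} = \frac{130 - 46}{23} \approx 3.7
```

standard deviations above the mean, in line with the approximately 95% and more than 99% confidence levels quoted. The adaptive alternative suggested in the conclusion would compute the same kind of threshold, mean plus a multiple of the SD, from each patient's own measurements.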
Abstract:
X-ray medical imaging is increasingly becoming three-dimensional (3-D). The dose to the population and its management are of special concern in computed tomography (CT). Task-based methods with model observers to assess the dose-image quality trade-off are promising tools, but they still need to be validated for real volumetric images. The purpose of the present work is to evaluate anthropomorphic model observers in 3-D detection tasks for low-contrast CT images. We scanned a low-contrast phantom containing four types of signals at three dose levels and used two reconstruction algorithms. We implemented a multislice model observer based on the channelized Hotelling observer (msCHO) with anthropomorphic channels and investigated different internal noise methods. We found a good correlation for all tested model observers. These results suggest that the msCHO can be used as a relevant task-based method to evaluate low-contrast detection for CT and optimize scan protocols to lower dose in an efficient way.
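The core of the channelized Hotelling observer named above can be sketched compactly; the msCHO of the paper extends it across slices and adds internal noise, both omitted here. A single-slice sketch, assuming training images with and without the signal are available and that an anthropomorphic channel matrix (e.g., Gabor channels) has been built separately:

```python
import numpy as np

def cho_scores(train_sig, train_noise, test_imgs, channels):
    """Single-slice channelized Hotelling observer.

    train_sig, train_noise: (n_images, n_pixels) training sets with and
    without the low-contrast signal; channels: (n_pixels, n_channels)
    anthropomorphic channel matrix. Returns one decision variable per
    test image; an ROC analysis of these scores gives detectability."""
    v_sig = train_sig @ channels             # channel outputs, signal present
    v_noise = train_noise @ channels         # channel outputs, signal absent
    k = 0.5 * (np.cov(v_sig, rowvar=False) + np.cov(v_noise, rowvar=False))
    w = np.linalg.solve(k, v_sig.mean(0) - v_noise.mean(0))  # Hotelling template
    return (test_imgs @ channels) @ w
```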
Abstract:
Binary probes are oligonucleotide probe pairs that hybridize adjacently to a complementary target nucleic acid. In order to detect this hybridization, the two probes can be modified with, for example, fluorescent molecules, chemically reactive groups or nucleic acid enzymes. The benefit of this kind of binary-probe-based approach is that the hybridization elicits a detectable signal that is distinguishable from background noise even though unbound probes are not removed by washing before measurement. In addition, the requirement of two simultaneous binding events increases specificity. Similarly to binary oligonucleotide probes, certain enzymes and fluorescent proteins can also be divided into two parts and used in separation-free assays. Split enzyme and fluorescent protein reporters have practical applications, among others, as tools to investigate protein-protein interactions within living cells. In this study, a novel label technology, switchable lanthanide luminescence, was introduced and used successfully in model assays for nucleic acid and protein detection. This label technology is based on a luminescent lanthanide chelate divided into two inherently non-luminescent moieties, an ion carrier chelate and a light-harvesting antenna ligand. These form a highly luminescent complex when brought into close proximity; i.e., the label moieties switch from a dark state to a luminescent state. This kind of mixed lanthanide complex has the same beneficial photophysical properties as the more typical lanthanide chelates and cryptates: sharp emission peaks, a long emission lifetime enabling time-resolved measurement, and a large Stokes shift, which minimize the background signal. Furthermore, the switchable lanthanide luminescence technique enables a homogeneous assay set-up. Here, switchable lanthanide luminescence label technology was first applied to sensitive, homogeneous, single-target nucleic acid and protein assays with picomolar detection limits and high signal-to-background ratios. Thereafter, a homogeneous four-plex nucleic acid array-based assay was developed. Finally, the label technology was shown to be effective in discriminating single-nucleotide mismatched targets from fully matched targets, and the luminescent complex formation was analyzed more thoroughly. In conclusion, this study demonstrates that the switchable lanthanide luminescence-based label technology can be used in various homogeneous bioanalytical assays.
Abstract:
Ventricular late potentials are low-amplitude signals originating from damaged myocardium and detected on the body surface by ECG filtering and averaging. Digital filters present in commercial equipment may interfere with arrhythmia risk stratification. We compared 40-Hz BiSpec (BI) and classical 40- to 250-Hz band-pass Butterworth bidirectional (BD) filters in terms of impact on time-domain variables and diagnostic properties. In a transverse retrospective age-adjusted case-control study, 221 subjects with sinus rhythm without bundle branch block were divided into three groups after signal-averaged ECG acquisition: GI (N = 40), clinically normal controls; GII (N = 158), subjects with coronary heart disease without sustained monomorphic ventricular tachycardia (SMVT); and GIII (N = 23), subjects with heart disease and documented SMVT. Conventional variables analyzed from vector magnitude data after averaging to 0.3 µV final noise were obtained by application of each filter to the averaged signal, and evaluated in pairs by numerical comparison and by diagnostic agreement assessment, using conventional and optimized thresholds of normality. Significant differences were found between BI and BD variables in all groups, with diagnostic results showing significant disagreement between the two filters [kappa value of 0.61 (P < 0.05) for GII and 0.31 for GIII (P = NS)]. Sensitivity for SMVT was lower with BI than with BD (65.2 vs 91.3%, respectively, P < 0.05). The filters provided significantly different numerical and diagnostic results, and the BI filter showed only limited clinical application to risk stratification of ventricular arrhythmia.
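As background for the BD filter: bidirectional band-pass filtering is commonly realized digitally by running a Butterworth filter forward and backward over the averaged signal. A sketch with SciPy, noting that `filtfilt` is only an approximation of the segmented forward/backward scheme used in classical signal-averaged ECG analysis, and that the filter order here is an assumption:

```python
from scipy.signal import butter, filtfilt

def bd_bandpass(sa_ecg, fs, low=40.0, high=250.0, order=4):
    """40-250 Hz Butterworth band-pass applied bidirectionally
    (forward and backward, zero phase) to a signal-averaged ECG lead.

    sa_ecg: averaged samples of one lead; fs: sampling rate in Hz
    (must exceed 500 Hz for the 250 Hz upper cutoff)."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, sa_ecg)
```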
Abstract:
Functional MRI (fMRI) resting-state experiments are aimed at identifying brain networks that support basal brain function. Although most investigators consider a ‘resting-state’ fMRI experiment to involve no specific external stimulation, subjects are unavoidably exposed to heavy acoustic noise produced by the equipment. In the present study, we evaluated the influence of auditory input on the resting-state networks (RSNs). Twenty-two healthy subjects were scanned using two similar echo-planar imaging sequences in the same 3T MRI scanner: a default pulse sequence and a reduced “silent” pulse sequence. Experimental sessions consisted of two consecutive 7-min runs with noise conditions (default or silent) counterbalanced across subjects. A self-organizing group independent component analysis was applied to the fMRI data in order to recognize the RSNs. The insula, left middle frontal gyrus, and right precentral and left inferior parietal lobules showed significant differences in the voxel-wise comparison between RSNs depending on the noise condition. In the presence of low-level noise, these areas Granger-cause oscillations in RSNs with cognitive implications (dorsal attention and entorhinal), while during high-noise acquisition these connectivities are reduced or inverted. Applying low-noise MR acquisitions in research may allow the detection of subtle differences in the RSNs, with implications for experimental planning in resting-state studies, data analysis, and ergonomic factors.
Abstract:
Epilepsy is a chronic brain disorder characterized by recurring seizures. An automatic seizure detector incorporated into a mobile closed-loop system can improve the quality of life for people with epilepsy. Commercial EEG headbands, such as the Emotiv Epoc, have the potential to be used as data acquisition devices for such a system. In order to estimate that potential, epileptic EEG signals from commercial devices were emulated in this work based on EEG data from a clinical dataset. The emulated characteristics include the referencing scheme, the set of electrodes used, the sampling rate, the sample resolution and the noise level. The performance of an existing algorithm for detection of epileptic seizures, developed in the context of clinical data, was evaluated on the emulated commercial data. The results show that, after transforming the data toward the characteristics of the Emotiv Epoc, the detection capabilities of the algorithm are mostly preserved. The ranges of acceptable changes in the signal parameters are also estimated.
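Several of the emulated characteristics listed above (sampling rate, sample resolution, noise level) can be sketched as a simple degradation pipeline; the electrode-subset selection and re-referencing steps are omitted. The headset parameters below roughly follow published Emotiv Epoc specifications but should be treated as assumptions rather than the paper's exact values:

```python
import numpy as np
from scipy.signal import resample_poly

def emulate_headset_eeg(clinical_eeg, fs_in, fs_out=128, n_bits=14,
                        lsb_uv=0.51, noise_uv=1.0, rng=None):
    """Degrade clinical EEG (channels x samples, microvolts, integer
    rate fs_in Hz) toward consumer-headset characteristics: resample,
    add sensor noise, and requantize to the headset's resolution.

    fs_out, n_bits and lsb_uv are assumed headset parameters; noise_uv
    (added sensor noise, uV RMS) is a free parameter of the emulation."""
    rng = rng or np.random.default_rng()
    x = resample_poly(clinical_eeg, int(fs_out), int(fs_in), axis=-1)
    x = x + noise_uv * rng.standard_normal(x.shape)   # extra sensor noise
    q = np.round(x / lsb_uv)                          # quantize to LSB steps
    q = np.clip(q, -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)
    return q * lsb_uv
```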
Abstract:
The operation of a previously proposed terahertz (THz) detector is formulated in detail. The detector is based on the hot-electron effect of the 2D electron gas (2DEG) in the quantum well (QW) of a GaAs/AlGaAs heterostructure. The interaction between the THz radiation and the 2DEG, the current enhancement due to the hot-electron effect, and the noise performance of the detector are analyzed.