1000 results for Blind Detection


Relevance:

30.00%

Publisher:

Abstract:

Empirical Mode Decomposition (EMD) is a commonly used method for solving the single-channel blind source separation (SCBSS) problem in signal processing. However, the mixing vector of SCBSS, on which the EMD method rests, has not yet been effectively constructed. The mixing vector reflects the weights of the original signal sources that form the single-channel blind signal source. In this paper, we propose a novel method to construct a mixing vector for a single-channel blind signal source that approximates the actual mixing vector by preserving the ratios between the signal weights. The constructed mixing vector can then be used to improve signal separation. Our method combines an adaptive filter, the least-squares method, EMD, and signal source samples to construct the mixing vector. Experimental tests on audio signals showed that our method improved the similarity of the source energy ratios from 0.2644 to 0.8366. This kind of recognition is very important in weak signal detection.
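As a toy illustration of the weight-estimation idea (a minimal sketch assuming reference samples of the sources are available; the paper's full pipeline also involves adaptive filtering and EMD), the weights of a single-channel mixture can be recovered by least squares, and the estimate preserves the ratio between the signal weights:

```python
import numpy as np

# Hypothetical sketch: estimate a mixing vector for a single-channel
# mixture by least squares, given reference samples of the sources.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
s1 = np.sin(2 * np.pi * 5 * t)           # source 1: sinusoid
s2 = np.sign(np.sin(2 * np.pi * 3 * t))  # source 2: square wave
true_w = np.array([0.7, 0.3])
x = true_w @ np.vstack([s1, s2])         # single-channel blind mixture

# Least-squares estimate of the mixing vector from the source samples:
S = np.vstack([s1, s2]).T                # (samples, sources)
w_hat, *_ = np.linalg.lstsq(S, x, rcond=None)

# The estimate keeps the same ratio between the signal weights.
ratio = w_hat[0] / w_hat[1]
print(np.round(w_hat, 3), np.round(ratio, 3))
```

In this noiseless toy case the estimate is exact; the paper's contribution is constructing such a vector when the mixture is genuinely blind.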

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to evaluate intraexaminer agreement in the detection of the mandibular canal roof (MCR) and mental foramen (MF) in panoramic radiographs. Forty panoramic radiographs of edentulous patients were used. Two calibrated examiners (A and B) read the images twice, for both sides independently, under blind conditions, with a 10-day interval between readings. Intraexaminer agreement in the interpretation of the MCR and MF was assessed by kappa statistics with linear weighting (κ). For the detection of the MCR on the left side, agreement was good for both examiners (A: κ = 0.67; B: κ = 0.71); on the right side it was κ = 0.47 and κ = 0.62 for A and B, respectively. For the detection of the MF, agreement was good for both examiners on the left side (A: κ = 0.61; B: κ = 0.63), while on the right side it was moderate (A: κ = 0.51) and fair (B: κ = 0.38). Intraexaminer agreement was good for the detection of the MCR and ranged from good to fair for the detection of the MF.
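A minimal sketch of the linearly weighted kappa statistic used above (the readings below are illustrative, not the study's data):

```python
import numpy as np

# Linearly weighted kappa for two readings on an ordinal scale,
# e.g. "not detected" / "partially detected" / "clearly detected".
def weighted_kappa(r1, r2, n_cat):
    O = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):
        O[a, b] += 1
    O /= O.sum()                                 # observed proportions
    # Linear weights: full credit on the diagonal, partial credit off it.
    i, j = np.indices((n_cat, n_cat))
    W = 1.0 - np.abs(i - j) / (n_cat - 1)
    E = np.outer(O.sum(axis=1), O.sum(axis=0))   # chance agreement
    return (np.sum(W * O) - np.sum(W * E)) / (1.0 - np.sum(W * E))

# Hypothetical first and second readings of the same radiographs:
first  = [0, 1, 2, 2, 1, 0, 2, 1]
second = [0, 1, 2, 1, 1, 0, 2, 2]
print(round(weighted_kappa(first, second, 3), 3))
```

Values around 0.6-0.8 are conventionally read as "good" agreement, matching the interpretation in the abstract.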

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: Dental erosion, the chemical dissolution of enamel without bacterial involvement, is a rarely reported manifestation of gastroesophageal reflux disease (GERD), as well as of recurrent vomiting and dietary habits. It leads to loss of tooth substance, hypersensitivity, functional impairment, and even tooth fracture. To date, dental erosions have been assessed using only very basic visual methods, and no evidence-based guidelines or studies exist regarding the prevention or treatment of GERD-related dental erosions. METHODS: In this randomized, double-blind study, we used optical coherence tomography (OCT) to quantify dental tissue demineralization and enamel loss before and after 3 weeks of acid-suppressive treatment with esomeprazole 20 mg b.i.d. or placebo in 30 patients presenting to the Berne University Dental Clinic with advanced dental erosions and abnormal acid exposure by 24-h esophageal pH manometry (defined as >4% of the 24-h period with pH <4). Enamel thickness, reflectivity, and absorbance as measures of demineralization were quantified by OCT before and after therapy at identical localizations on the teeth with the most severe visible erosions as well as at several other predefined sites. RESULTS: The mean ± s.e.m. decrease in enamel thickness of all teeth before and after treatment at the site of maximum exposure was 7.2 ± 0.16 µm with esomeprazole and 15.25 ± 0.17 µm with placebo (P = 0.013), representing a loss of 0.3% and 0.8% of the total enamel thickness, respectively. The change in optical reflectivity to a depth of 25 µm after treatment was −1.122 ± 0.769 dB with esomeprazole and +2.059 ± 0.534 dB with placebo (P = 0.012), with increased reflectivity signifying demineralization. CONCLUSIONS: OCT non-invasively detected and quantified significantly diminished progression of dental tissue demineralization and enamel loss after only 3 weeks of treatment with esomeprazole 20 mg b.i.d. vs. placebo. This suggests that esomeprazole may be useful in counteracting the progression of GERD-related dental erosions. Further validation of preventative treatment regimens using this sensitive detection method is required, including longer follow-up and correlation with quantitative reflux measures.

Relevance:

30.00%

Publisher:

Abstract:

HYPOTHESIS Facial nerve monitoring can be used synchronously with a high-precision robotic tool as a functional warning to prevent collision of the drill bit with the facial nerve during direct cochlear access (DCA). BACKGROUND Minimally invasive DCA aims to eliminate the need for a mastoidectomy by drilling a small tunnel through the facial recess to the cochlea with the aid of stereotactic tool guidance. Because the procedure is performed in a blind manner, structures such as the facial nerve are at risk. Neuromonitoring is commonly used to help surgeons identify the facial nerve (FN) during routine surgical procedures in the mastoid. Recently, neuromonitoring technology was integrated into a commercially available drill system, enabling real-time monitoring of the FN. The objective of this study was to determine whether this drilling system could be used to warn of an impending collision with the FN during robot-assisted DCA. MATERIALS AND METHODS The sheep was chosen as a suitable model for this study because of its similarity to human ear anatomy. The same surgical workflow applicable to human patients was performed in the animal model. Bone screws, serving as reference fiducials, were placed in the skull near the ear canal. The sheep head was imaged with a computed tomography scanner, and segmentation of the FN, mastoid, and other relevant structures, as well as planning of the drilling trajectories, was carried out using a dedicated software tool. During the procedure, a surgical drill system was connected to a nerve monitor and guided by a custom-built robot system. As the planned trajectories were drilled, stimulation and EMG response signals were recorded. A postoperative analysis was performed after each surgery to determine the actual drilled positions.
RESULTS Using the calibrated pose synchronized with the EMG signals, the precise relationship between distance to the FN and EMG response at 3 different stimulation intensities could be determined for 11 tunnels drilled in 3 subjects. CONCLUSION From the results, it was determined that the current implementation of the neuromonitoring system lacks the sensitivity and repeatability necessary for use as a warning device in robotic DCA. We hypothesize that this is primarily because of the stimulation pattern produced by using a noninsulated drill as a stimulating probe. Further work is necessary to determine whether specific design changes can improve the sensitivity and specificity.

Relevance:

30.00%

Publisher:

Abstract:

The accuracy of Global Positioning System (GPS) time series is degraded by the presence of offsets. To assess the effectiveness of methods that detect and remove these offsets, we designed and managed the Detection of Offsets in GPS Experiment. We simulated time series that mimicked realistic GPS data consisting of a velocity component, offsets, and white and flicker noise (1/f spectrum noise) combined in an additive model. The data set was made available to the GPS analysis community without revealing the offsets, and several groups conducted blind tests with a range of detection approaches. The results show that, at present, manual methods (where offsets are hand-picked) almost always give better results than automated or semi-automated methods (two automated methods give velocity biases quite similar to the best manual solutions). For instance, the 5th-95th percentile range in velocity bias for automated approaches is 4.2 mm/yr (most commonly ±0.4 mm/yr from the truth), whereas it is 1.8 mm/yr for the manual solutions (most commonly 0.2 mm/yr from the truth). The magnitude of offsets detectable by manual solutions is smaller than for automated solutions, with the smallest detectable offset equal to 5 mm for the best manual solutions and 8 mm for the best automatic solutions. Assuming the simulated noise levels are representative of real GPS time series, geophysical interpretation of individual site velocities lower than 0.2-0.4 mm/yr is therefore certainly not robust, and a limit nearer 1 mm/yr would be a more conservative choice. Further work to improve offset detection in GPS coordinate time series is required before we can routinely interpret sub-mm/yr velocities for single GPS stations.
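A simulated series of the kind described can be sketched as follows (a hypothetical configuration: the offset epochs, noise amplitudes, and the frequency-domain flicker-noise generator are illustrative assumptions, not the experiment's actual recipe):

```python
import numpy as np

# Toy GPS coordinate time series: linear velocity + step offsets
# + white noise + flicker (1/f) noise, combined additively.
rng = np.random.default_rng(42)
n_days = 3650                       # ~10 years of daily solutions
t = np.arange(n_days) / 365.25      # time in years

velocity = 3.0                      # mm/yr linear trend
series = velocity * t

# Step offsets at hidden epochs (equipment changes, earthquakes, ...):
for epoch, size_mm in [(800, 6.0), (2400, -9.0)]:
    series[epoch:] += size_mm

series += rng.normal(0.0, 1.0, n_days)   # 1 mm white noise

# Flicker noise by spectral shaping: scale white-noise Fourier
# coefficients by 1/sqrt(f) so the power spectrum goes as 1/f.
spec = np.fft.rfft(rng.normal(0.0, 1.0, n_days))
f = np.fft.rfftfreq(n_days)
f[0] = f[1]                              # avoid division by zero at DC
series += np.fft.irfft(spec / np.sqrt(f), n_days) * 0.05

# A naive linear fit that ignores the offsets biases the velocity,
# which is exactly the error the blind tests quantify.
fit = np.polyfit(t, series, 1)
print(f"estimated velocity: {fit[0]:.2f} mm/yr (true: {velocity} mm/yr)")
```

An offset-detection method would first locate and remove the steps before the trend is estimated.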

Relevance:

30.00%

Publisher:

Abstract:

In the framework of the MSSM, we examine several simplified models where only a few superpartners are light. This allows us to study WIMP-nucleus scattering in terms of a handful of MSSM parameters and thereby scrutinize their impact on dark matter direct-detection experiments. Focusing on spin-independent WIMP-nucleon scattering, we derive simplified, analytic expressions for the Wilson coefficients associated with Higgs and squark exchange. We utilize these results to study the complementarity of constraints due to direct-detection, flavor, and collider experiments. We also identify parameter configurations that produce (almost) vanishing cross sections. In the proximity of these so-called blind spots, we find that the amount of isospin violation may be much larger than typically expected in the MSSM. This feature is a generic property of parameter regions where cross sections are suppressed, and highlights the importance of a careful analysis of the nucleon matrix elements and the associated hadronic uncertainties. This becomes especially relevant once the increased sensitivity of future direct-detection experiments corners the MSSM into these regions of parameter space.
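As an illustration of such a blind spot (a standard result for the bino-higgsino limit with light-Higgs exchange; not necessarily the exact parameterization used in this work), the effective WIMP-Higgs coupling scales as:

```latex
% Effective h-exchange coupling in the bino--higgsino limit:
\[
  c_{h\chi\chi} \;\propto\; m_\chi + \mu \sin 2\beta ,
\]
% so the h-exchange contribution to the spin-independent cross
% section vanishes along the blind-spot line
\[
  m_\chi + \mu \sin 2\beta = 0 ,
\]
% which requires $\mu < 0$.
```

Near such lines the leading contribution cancels, which is why subleading effects such as isospin violation and hadronic uncertainties become important.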

Relevance:

30.00%

Publisher:

Abstract:

AIMS Children conceived by assisted reproductive technology (ART) display vascular dysfunction. Its underlying mechanism, potential reversibility, and long-term consequences for cardiovascular risk are unknown. In mice, ART induces arterial hypertension and shortens the life span. These problems are related to decreased vascular endothelial nitric oxide synthase (eNOS) expression and nitric oxide (NO) synthesis. The aim of this study was to determine whether ART-induced vascular dysfunction in humans is related to a similar mechanism and is potentially reversible. To this end, we tested whether antioxidants improve endothelial function by scavenging free radicals and increasing NO bioavailability. METHODS AND RESULTS In this prospective, double-blind, placebo-controlled study in 21 ART and 21 control children, we assessed the effects of a four-week oral supplementation with the antioxidant vitamins C (1 g) and E (400 IU) or placebo (allocation ratio 2:1) on flow-mediated vasodilation (FMD) of the brachial artery and pulmonary artery pressure (echocardiography) during high-altitude exposure (3454 m), a manoeuvre known to facilitate the detection of pulmonary vascular dysfunction and to decrease NO bioavailability by stimulating oxidative stress. Antioxidant supplementation significantly increased plasma NO measured by ozone-based chemiluminescence (from 21.7 ± 7.9 to 26.9 ± 7.6 µM, p = 0.04) and FMD (from 7.0 ± 2.1 to 8.7 ± 2.0%, p = 0.004) and attenuated altitude-induced pulmonary hypertension (from 33 ± 8 to 28 ± 6 mm Hg, p = 0.028) in ART children, whereas it had no detectable effect in control children. CONCLUSIONS Antioxidant administration to ART children improved NO bioavailability and vascular responsiveness in the systemic and pulmonary circulation. Collectively, these findings indicate that in young individuals ART-induced vascular dysfunction is subject to redox regulation and is reversible.

Relevance:

30.00%

Publisher:

Abstract:

The magnetoencephalogram (MEG) is contaminated with undesired signals called artifacts. Some of the most important are the cardiac artifact (CA), the ocular artifact (OA), and power line noise (PLN). Blind source separation (BSS) has been used to reduce the influence of these artifacts in the data. There is a plethora of BSS-based artifact removal approaches, but few comparative analyses. In this study, MEG background activity from 26 subjects was processed with five widespread BSS techniques (AMUSE, SOBI, JADE, extended Infomax, and FastICA) and one constrained BSS (cBSS) technique. The ability of several combinations of BSS algorithm, epoch length, and artifact detection metric to automatically reduce the CA, OA, and PLN was then quantified with objective criteria. The results identified cBSS as a very suitable approach for removing the CA. Additionally, a combination of AMUSE or SOBI with artifact detection metrics based on entropy or power criteria decreased the OA. Finally, the PLN was reduced by means of a spectral metric. These findings confirm the utility of BSS in artifact removal for MEG background activity.
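A minimal second-order BSS sketch in the spirit of AMUSE (whitening followed by diagonalization of a time-lagged covariance); the channels below are toy synthetic signals, not real MEG, and the lag choice is an illustrative assumption:

```python
import numpy as np

# Toy "channels": a mixture of a spiky periodic (cardiac-like) source,
# a slow drifting (ocular-like) source, and white background activity.
rng = np.random.default_rng(1)
n = 5000
t = np.arange(n)
cardiac = np.sin(2 * np.pi * t / 50) ** 3             # spiky, periodic
ocular = np.repeat(rng.normal(0, 1, n // 100), 100)   # slow drift
brain = rng.normal(0, 1, n)                           # background
S = np.vstack([cardiac, ocular, brain])
A = rng.normal(0, 1, (3, 3))                          # unknown mixing
X = A @ S                                             # observed channels

# 1) Whitening: decorrelate and normalize the channels.
X = X - X.mean(axis=1, keepdims=True)
C0 = X @ X.T / n
d, E = np.linalg.eigh(C0)
W = E @ np.diag(d ** -0.5) @ E.T                      # whitening matrix
Z = W @ X

# 2) Eigendecomposition of a symmetrized lagged covariance (lag = 1):
# sources with distinct temporal structure get distinct eigenvalues.
C1 = Z[:, 1:] @ Z[:, :-1].T / (n - 1)
C1 = (C1 + C1.T) / 2
_, U = np.linalg.eigh(C1)
sources = U.T @ Z                                     # recovered components
print(sources.shape)
```

A component flagged by an artifact detection metric (e.g. entropy or power) would then be zeroed before back-projecting to clean the channels.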

Relevance:

30.00%

Publisher:

Abstract:

A blind nonlinear interference cancellation receiver for code-division multiple-access (CDMA)-based communication systems operating over Rayleigh flat-fading channels is proposed. The receiver, which assumes knowledge of the signature waveforms of all the users, is implemented in an asynchronous CDMA environment. Unlike the conventional MMSE receiver, the proposed blind ICA multiuser detector is shown to be robust without training sequences, requiring only knowledge of the signature waveforms, and it achieves nearly the same performance as the conventional training-based MMSE receiver. Several comparisons and experiments, based on examining BER performance in AWGN and Rayleigh fading, are performed to verify the validity of the proposed blind ICA multiuser detector.

Relevance:

30.00%

Publisher:

Abstract:

We consider blind signal detection in an asynchronous code-division multiple-access (CDMA) system employing short spreading sequences in the presence of unknown multipath fading. This approach is capable of countering the presence of multiple-access interference (MAI) in CDMA fading channels. The proposed blind multiuser detector is based on independent component analysis (ICA) to mitigate both MAI and noise. This algorithm has been utilised in blind source separation (BSS) of unknown sources from their mixtures and can also be used for estimating the basis vectors of BSS. The aim is to include an ICA algorithm within a wireless receiver in order to reduce the level of interference in wideband systems. This blind multiuser detector requires no training sequence, in contrast to the conventional multiuser detection receiver. The proposed ICA blind multiuser detector is made robust with respect to knowledge of the signature waveforms and the timing of the user of interest. Several experiments are performed in order to verify the validity of the proposed ICA algorithm.
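For orientation, a toy synchronous CDMA model (a deliberate simplification of the asynchronous, multipath setting treated here) shows how short signature sequences mix user bits into one received stream; an ICA-based detector treats the received chip vectors as such a mixture and unmixes it blindly:

```python
import numpy as np

# Toy synchronous CDMA model: K users spread BPSK bits with short
# signature sequences; the receiver observes their noisy superposition.
rng = np.random.default_rng(7)
K, L, n_bits = 3, 8, 400                       # users, spreading length, bits
codes = rng.choice([-1.0, 1.0], size=(K, L))   # short signature waveforms
codes /= np.sqrt(L)                            # unit-energy signatures
bits = rng.choice([-1.0, 1.0], size=(K, n_bits))
amps = np.array([1.0, 2.0, 2.0])               # strong interferers

# Received chips: superposition of all users plus AWGN.
r = (amps[:, None] * bits).T @ codes + rng.normal(0, 0.1, (n_bits, L))

# Conventional single-user matched filter for user 0: correlate with the
# desired signature. Nonzero cross-correlations with the other signatures
# leak multiple-access interference (MAI) into the decision statistic.
y = r @ codes[0]
ber_mf = np.mean(np.sign(y) != bits[0])
print(f"matched-filter BER for user 0: {ber_mf:.3f}")
```

An ICA algorithm applied to the rows of `r` would instead estimate the independent user streams directly, without a training sequence.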

Relevance:

30.00%

Publisher:

Abstract:

The 9/11 Act mandates the inspection of 100% of cargo shipments entering the U.S. by 2012 and 100% inspection of air cargo by March 2010. So far, only 5% of inbound shipping containers are inspected thoroughly while air cargo inspections have fared better at 50%. Government officials have admitted that these milestones cannot be met since the appropriate technology does not exist. This research presents a novel planar solid phase microextraction (PSPME) device with enhanced surface area and capacity for collection of the volatile chemical signatures in air that are emitted from illicit compounds for direct introduction into ion mobility spectrometers (IMS) for detection. These IMS detectors are widely used to detect particles of illicit substances and do not have to be adapted specifically to this technology. For static extractions, PDMS and sol-gel PDMS PSPME devices provide significant increases in sensitivity over conventional fiber SPME. Results show a 50–400 times increase in mass detected of piperonal and a 2–4 times increase for TNT. In a blind study of 6 cases suspected to contain varying amounts of MDMA, PSPME-IMS correctly detected 5 positive cases with no false positives or negatives. One of these cases had minimal amounts of MDMA resulting in a false negative response for fiber SPME-IMS. A La (dihed) phase chemistry has shown an increase in the extraction efficiency of TNT and 2,4-DNT and enhanced retention over time. An alternative PSPME device was also developed for the rapid (seconds) dynamic sampling and preconcentration of large volumes of air for direct thermal desorption into an IMS. This device affords high extraction efficiencies due to strong retention properties under ambient conditions resulting in ppt detection limits when 3.5 L of air are sampled over the course of 10 seconds. 
Dynamic PSPME was used to sample the headspace over the following: MDMA tablets (12–40 ng of piperonal detected), high explosives (Pentolite) (0.6 ng of TNT detected), and several smokeless powders (26–35 ng of 2,4-DNT and 11–74 ng of DPA detected). PSPME-IMS technology is flexible to end-user needs; it is low-cost, rapid, sensitive, easy to use, easy to implement, and effective.

Relevance:

30.00%

Publisher:

Abstract:

Finding rare events in multidimensional data is an important detection problem with applications in many fields, such as risk estimation in the insurance industry, finance, flood prediction, medical diagnosis, quality assurance, security, and safety in transportation. The occurrence of such anomalies is so infrequent that there is usually not enough training data to learn an accurate statistical model of the anomaly class. In some cases, such events may never have been observed, so the only available information is a set of normal samples and an assumed pairwise similarity function. Such a metric may only be known up to a certain number of unspecified parameters, which either need to be learned from training data or fixed by a domain expert. Sometimes the anomalous condition can be formulated algebraically, such as a measure exceeding a predefined threshold, but nuisance variables may complicate the estimation of that measure. Change detection methods used in time series analysis are not easily extendable to the multidimensional case, where discontinuities are not localized to a single point. On the other hand, in higher dimensions data exhibit more complex interdependencies, and there is redundancy that can be exploited to adaptively model the normal data. In the first part of this dissertation, we review the theoretical framework for anomaly detection in images and previous anomaly detection work done in the context of crack detection and detection of anomalous components in railway tracks. In the second part, we propose new anomaly detection algorithms. The fact that curvilinear discontinuities in images are sparse with respect to the frame of shearlets allows us to pose this anomaly detection problem as basis pursuit optimization. We therefore pose the problem of detecting curvilinear anomalies in noisy textured images as a blind source separation problem under sparsity constraints, and propose an iterative shrinkage algorithm to solve it.
Taking advantage of the parallel nature of this algorithm, we describe how it can be accelerated using graphics processing units (GPUs). We then propose a new method for finding defective components on railway tracks using cameras mounted on a train, describing how to extract features and use a combination of classifiers to solve this problem. We then scale anomaly detection to bigger datasets with complex interdependencies, showing that the anomaly detection problem fits naturally in the multitask learning framework: the first task consists of learning a compact representation of the good samples, while the second task consists of learning the anomaly detector. Using deep convolutional neural networks, we show that it is possible to train a deep model with a limited number of anomalous examples. In sequential detection problems, the presence of time-variant nuisance parameters affects the detection performance. In the last part of this dissertation, we present a method for adaptively estimating the threshold of sequential detectors using Extreme Value Theory in a Bayesian framework. Finally, conclusions on the results obtained are provided, followed by a discussion of possible future work.
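The sparsity-constrained formulation can be sketched with a generic iterative shrinkage (ISTA) loop; the random dictionary below stands in for a shearlet frame and is purely illustrative:

```python
import numpy as np

# Minimal ISTA sketch: solve min_c 0.5*||x - D c||^2 + lam*||c||_1
# by gradient steps followed by soft-thresholding. In the dissertation's
# setting, D would be a shearlet synthesis operator in which curvilinear
# anomalies are sparse; here it is just a random overcomplete dictionary.
rng = np.random.default_rng(3)
n, m = 64, 128
D = rng.normal(0, 1, (n, m)) / np.sqrt(n)        # overcomplete dictionary
c_true = np.zeros(m)
c_true[rng.choice(m, 5, replace=False)] = rng.normal(0, 3, 5)  # sparse part
x = D @ c_true + rng.normal(0, 0.01, n)          # noisy observation

lam = 0.05
step = 1.0 / np.linalg.norm(D, 2) ** 2           # 1/L, L = ||D||_2^2
c = np.zeros(m)
for _ in range(500):
    g = c + step * D.T @ (x - D @ c)             # gradient step on data fit
    c = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # shrinkage

print(f"nonzeros recovered: {np.count_nonzero(np.abs(c) > 0.05)}")
```

Each iteration is an independent per-coefficient update after one matrix product, which is what makes the algorithm amenable to GPU acceleration.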