904 results for detection method


Relevance: 30.00%

Abstract:

Transient episodes of synchronisation of neuronal activity in particular frequency ranges are thought to underlie cognition. Empirical mode decomposition phase locking (EMDPL) analysis is a method for determining the frequency and timing of phase synchrony that is adaptive to intrinsic oscillations within data, alleviating the need for arbitrary bandpass filter cut-off selection. It is extended here to address the choice of reference electrode and the removal of spurious synchrony resulting from volume conduction. Spline Laplacian transformation and independent component analysis (ICA) are performed as pre-processing steps, and preservation of phase synchrony between synthetic signals, combined using a simple forward model, is demonstrated. The method is contrasted with the use of bandpass filtering following the same pre-processing steps, and filter cut-offs are shown to influence synchrony detection markedly. Furthermore, an approach to the assessment of multiple EEG trials using the method is introduced, and the assessment of statistical significance of phase locking episodes is extended to render it adaptive to local phase synchrony levels. EMDPL is validated in the analysis of real EEG data recorded during finger tapping. The time course of event-related (de)synchronisation (ERD/ERS) is shown to differ from that of longer range phase locking episodes, implying different roles for these different types of synchronisation. It is suggested that the increase in phase locking which occurs just prior to movement, coinciding with a reduction in power (or ERD), may result from selection of the neural assembly relevant to the particular movement.
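EMDPL first decomposes each signal into intrinsic mode functions, which is beyond a short sketch, but the phase-locking statistic itself can be illustrated. The sketch below (plain NumPy; the `analytic_signal` helper and the test signals are illustrative, not taken from the paper) computes the phase-locking value between two signals from their Hilbert phases:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via an FFT-based Hilbert transform (NumPy only)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

def phase_locking_value(x, y):
    """PLV = |mean(exp(i*(phi_x - phi_y)))|: 1 = perfect locking, ~0 = none."""
    phi_x = np.angle(analytic_signal(x))
    phi_y = np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Two 5 Hz sinusoids with a fixed phase offset are perfectly locked;
# a 5 Hz vs 8 Hz pair drifts in relative phase and is not.
t = np.arange(0, 2, 1 / 200.0)
locked = phase_locking_value(np.sin(2 * np.pi * 5 * t),
                             np.sin(2 * np.pi * 5 * t + 0.7))
unlocked = phase_locking_value(np.sin(2 * np.pi * 5 * t),
                               np.sin(2 * np.pi * 8 * t))
```

In EMDPL this statistic would be evaluated per pair of intrinsic mode functions rather than on the raw signals.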

Relevance: 30.00%

Abstract:

The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that a low similarity should be found in the regions of lesions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach, which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated that not only were the majority of lesions well detected but also normal tissues were identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and had fewer false positives around the ventricles and the edge of the brain.
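The paper's exact similarity measure is not reproduced here; as a rough sketch, one might score each voxel by the agreement between its fuzzy membership and the tissue prior, normalize the scores, and flag low-similarity voxels. All names, the toy arrays and the 0.5 threshold below are hypothetical:

```python
import numpy as np

def lesion_candidates(fuzzy_seg, tissue_prior, threshold=0.5):
    """Flag voxels where an intensity-based fuzzy segmentation disagrees
    with a location-based tissue prior (hypothetical similarity measure)."""
    sim = 1.0 - np.abs(fuzzy_seg - tissue_prior)    # 1 = perfect agreement
    sim = (sim - sim.min()) / (sim.max() - sim.min() + 1e-12)  # normalize to [0, 1]
    return sim < threshold                           # low similarity -> candidate lesion

# A lesion appears with high grey-matter-like membership where the prior
# expects something else, so similarity drops at that voxel.
fuzzy = np.array([[0.9, 0.9, 0.1],
                  [0.9, 0.9, 0.1],
                  [0.1, 0.1, 0.9]])   # bottom-right voxel disagrees with the prior
prior = np.array([[0.9, 0.9, 0.1],
                  [0.9, 0.9, 0.1],
                  [0.1, 0.1, 0.1]])
mask = lesion_candidates(fuzzy, prior)
```

The paper's cleaning step would then remove candidate voxels that coincide with enlarged ventricles.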

Relevance: 30.00%

Abstract:

A low-cost, disposable instrument for measuring solar radiation during meteorological balloon flights through cloud layers is described. Using a photodiode detector and low thermal drift signal conditioning circuitry, the device showed less than 1% drift as temperatures varied from +20 °C to −35 °C. The angular response to radiation, which declined less rapidly than the cosine of the angle between the incident radiation and normal incidence, is used for cloud detection by exploiting the motion of the platform. Oriented upwards, the natural motion imposed by the balloon allows cloud and clear air to be distinguished by the absence of radiation variability within cloud, where the diffuse radiation present is isotropic. The optical method employed by the solar radiation instrument has also been demonstrated to provide higher resolution measurements of cloud boundaries than relative humidity measurements alone.
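The in-cloud criterion (absence of radiation variability) can be sketched as a rolling standard deviation test; the window length, threshold and synthetic signals below are illustrative, not the instrument's actual parameters:

```python
import numpy as np

def in_cloud(signal, window=20, var_threshold=0.05):
    """Classify each sample as in-cloud when the local standard deviation of
    the solar radiation signal falls below a threshold: inside cloud the
    diffuse, isotropic light suppresses the swing-induced variability."""
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    local_std = np.array([padded[i:i + window].std()
                          for i in range(len(signal))])
    return local_std < var_threshold

# Clear air: the pendulum motion of the balloon modulates the direct beam.
# Cloud: a near-constant diffuse signal.
t = np.arange(200)
clear = 1.0 + 0.3 * np.sin(2 * np.pi * t / 25.0)
cloud = np.full(200, 0.6)
flags = in_cloud(np.concatenate([clear, cloud]))
```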

Relevance: 30.00%

Abstract:

Very high-resolution Synthetic Aperture Radar sensors represent an alternative to aerial photography for delineating floods in built-up environments where flood risk is highest. However, even with currently available SAR image resolutions of 3 m and higher, signal returns from man-made structures hamper the accurate mapping of flooded areas. Enhanced image processing algorithms and a better exploitation of image archives are required to facilitate the use of microwave remote sensing data for monitoring flood dynamics in urban areas. In this study a hybrid methodology combining radiometric thresholding, region growing and change detection is introduced as an approach enabling the automated, objective and reliable extraction of flood extent from very high-resolution urban SAR images. The method is based on the calibration of a statistical distribution of "open water" backscatter values inferred from SAR images of floods. SAR images acquired during dry conditions enable the identification of areas i) that are not "visible" to the sensor (i.e. regions affected by 'layover' and 'shadow') and ii) that systematically behave as specular reflectors (e.g. smooth tarmac, permanent water bodies). Change detection with respect to a pre- or post-flood reference image thereby reduces over-detection of inundated areas. A case study of the July 2007 Severn River flood (UK) observed by the very high-resolution SAR sensor on board TerraSAR-X as well as airborne photography highlights advantages and limitations of the proposed method. We conclude that even though the fully automated SAR-based flood mapping technique overcomes some limitations of previous methods, further technological and methodological improvements are necessary for SAR-based flood detection in urban areas to match the flood mapping capability of high quality aerial photography.
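A minimal sketch of the hybrid scheme, assuming illustrative dB thresholds rather than the calibrated "open water" distribution of the paper: radiometric thresholding seeds water pixels, region growing extends them, and comparison with a dry-condition reference removes pixels that are always dark (layover shadow, smooth tarmac, permanent water bodies):

```python
import numpy as np
from collections import deque

def flood_map(flood_db, dry_db, seed_db=-15.0, grow_db=-10.0, change_db=3.0):
    """Hybrid flood extraction sketch: threshold seeds, grow regions, then
    keep only pixels that darkened relative to the dry reference image.
    All threshold values here are illustrative, not those of the paper."""
    rows, cols = flood_db.shape
    water = flood_db <= seed_db              # radiometric thresholding (seeds)
    queue = deque(zip(*np.nonzero(water)))
    while queue:                             # 4-connected region growing
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and not water[rr, cc] \
                    and flood_db[rr, cc] <= grow_db:
                water[rr, cc] = True
                queue.append((rr, cc))
    # Change detection: discard pixels that were equally dark when dry.
    return water & (dry_db - flood_db >= change_db)

flood = np.array([[-16., -12.,  -4.],
                  [ -5., -11.,  -4.],
                  [ -4.,  -4., -16.]])
dry = np.array([[ -5.,  -5.,  -4.],
                [ -5.,  -5.,  -4.],
                [ -4.,  -4., -16.]])   # bottom-right is a permanent water body
mask = flood_map(flood, dry)
```

The permanently dark bottom-right pixel is grown into by thresholding but rejected by the change-detection step, mirroring the paper's rationale for using dry-condition imagery.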

Relevance: 30.00%

Abstract:

A polymerase chain reaction (PCR) assay was developed to detect Chlamydia psittaci DNA in faeces and tissue samples from avian species. Primers were designed to amplify a 264 bp product derived from part of the 5' non-translated region and part of the coding region of the ompA gene, which encodes the major outer membrane protein. Amplified sequences were confirmed by Southern hybridization using an internal probe. The sensitivity of the combined assay was found to be between 60 and 600 fg of chlamydial DNA (approximately 6 to 60 genome copies). The specificity of the assay was confirmed since PCR product was not obtained from samples containing several serotypes of C. trachomatis, strains of C. pneumoniae, or the type strain of C. pecorum, nor from samples containing microorganisms commonly found in the avian gut flora. In this study, 404 avian faeces and 141 avian tissue samples received by the Central Veterinary Laboratory over a 6 month period were analysed by PCR, antigen detection ELISA and, where possible, cell culture isolation. PCR performed favourably compared with ELISA and cell culture, or with ELISA alone. The PCR assay was especially suited to the detection of C. psittaci DNA in avian faeces samples. The test was also useful when applied to tissue samples from small contact birds associated with a case of human psittacosis, where ELISA results were negative and chlamydial isolation was a less favourable method due to the need for rapid diagnosis.

Relevance: 30.00%

Abstract:

Denaturing high-performance liquid chromatography (DHPLC) was evaluated as a rapid screening and identification method for DNA sequence variation detection in the quinolone resistance-determining region of gyrA from Salmonella serovars. A total of 203 isolates of Salmonella were screened using this method. DHPLC analysis of 14 isolates representing each type of novel or multiple mutations and the wild type were compared with LightCycler-based PCR-gyrA hybridization mutation assay (GAMA) and single-strand conformational polymorphism (SSCP) analyses. The 14 isolates gave seven different SSCP patterns, and LightCycler detected four different mutations. DHPLC detected 11 DNA sequence variants at eight different codons, including those detected by LightCycler or SSCP. One of these mutations was silent. Five isolates contained multiple mutations, and four of these could be distinguished from the composite sequence variants by their DHPLC profile. Seven novel mutations were identified at five different loci not previously described in quinolone-resistant salmonella. DHPLC analysis proved advantageous for the detection of novel and multiple mutations. DHPLC also provides a rapid, high-throughput alternative to LightCycler and SSCP for screening frequently occurring mutations.

Relevance: 30.00%

Abstract:

Safety is of the highest priority in mining operations, and many traditional mining countries are currently investing in the implementation of wireless sensors capable of detecting risk factors, providing early warning signs to prevent accidents and significant economic losses. The objective of this research is to contribute to the implementation of sensors for continuous monitoring inside underground mines by providing technical parameters for the design of sensor networks applied in underground coal mines. The application of sensors capable of measuring variables of interest in real time promises to have a great impact on safety in the mining industry. The relationship between geological conditions and mining method design establishes how a system of continuous monitoring should be implemented. In this paper, the main causes of accidents in underground coal mines are established based on existing worldwide reports. Variables (temperature, gas, structural faults, fires) that can be related to the most frequent causes of disaster, and their relevant measuring ranges, are then presented; the advantages for management and mining operations are also discussed, including an analysis of applying these systems in terms of Benefit, Opportunity, Cost and Risk. The publication focuses on coal mining, given the proportion of these events each year worldwide in which a significant number of workers are seriously injured or killed. Finally, a dynamic assessment of safety in underground mines is proposed; this approach offers a contribution to the design of personalized monitoring networks, and the experience developed in coal mines provides a tool that facilitates the development and application of this technology within underground coal mines.
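At its simplest, the early-warning idea reduces to comparing real-time sensor readings against per-variable alarm thresholds. The sketch below is purely illustrative; the variable names and threshold values are hypothetical, not regulatory limits or values from the paper:

```python
# Hypothetical alarm thresholds for an underground coal mine monitoring
# network (illustrative values only, not regulatory limits).
THRESHOLDS = {
    "methane_pct":   1.0,   # % CH4 in air
    "co_ppm":        50.0,  # carbon monoxide
    "temperature_c": 35.0,
}

def check_alarms(readings):
    """Return the list of monitored variables whose current reading
    exceeds its alarm threshold."""
    return [name for name, value in readings.items()
            if name in THRESHOLDS and value > THRESHOLDS[name]]

alarms = check_alarms({"methane_pct": 1.4, "co_ppm": 12.0,
                       "temperature_c": 30.0})
```

A deployed network would additionally handle sensor dropout, hysteresis and trend analysis rather than single-sample comparisons.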

Relevance: 30.00%

Abstract:

In this paper, various types of fault detection methods for fuel cells are compared, for example those that use a model-based approach, a data-driven approach, or a combination of the two. The potential advantages and drawbacks of each method are discussed and comparisons between methods are made. In particular, classification algorithms are investigated, which separate a data set into classes or clusters based on some prior knowledge or measure of similarity. The application of classification methods to vectors of currents reconstructed by magnetic tomography, or directly to vectors of magnetic field measurements, is explored. Bases are simulated using the finite integration technique (FIT), and regularization techniques are employed to overcome ill-posedness. Fisher's linear discriminant is used to illustrate these concepts. Numerical experiments show that the ill-posedness of the magnetic tomography problem is part of the classification problem on magnetic field measurements as well. This is independent of the particular working mode of the cell but is influenced by the type of faulty behavior that is studied. The numerical results demonstrate the ill-posedness through the exponential decay behavior of the singular values for three examples of fault classes.
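Fisher's linear discriminant for two classes of measurement vectors can be sketched as below. The data and fault classes are synthetic, and the small ridge term stands in generically for the regularization the paper applies to the ill-posed problem; it is not the paper's actual regularization scheme:

```python
import numpy as np

def fisher_lda(class_a, class_b):
    """Fisher's linear discriminant for two classes of vectors.
    Returns the projection direction w and the midpoint decision threshold."""
    mu_a, mu_b = class_a.mean(axis=0), class_b.mean(axis=0)
    # Within-class scatter, with a small ridge term as a stand-in for the
    # regularization needed when the problem is ill-posed.
    sw = np.cov(class_a, rowvar=False) + np.cov(class_b, rowvar=False)
    sw += 1e-6 * np.eye(sw.shape[0])
    w = np.linalg.solve(sw, mu_a - mu_b)
    threshold = w @ (mu_a + mu_b) / 2.0
    return w, threshold

# Synthetic "healthy" and "faulty" measurement vectors (2-D for brevity;
# real magnetic field measurement vectors are much higher-dimensional).
rng = np.random.default_rng(0)
normal = rng.normal(loc=[0.0, 0.0], scale=0.2, size=(50, 2))
faulty = rng.normal(loc=[2.0, 1.0], scale=0.2, size=(50, 2))
w, thr = fisher_lda(normal, faulty)
pred_faulty = faulty @ w < thr    # faulty class falls below the threshold
```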

Relevance: 30.00%

Abstract:

The variability of results from different automated methods of detection and tracking of extratropical cyclones is assessed in order to identify uncertainties related to the choice of method. Fifteen international teams applied their own algorithms to the same dataset, the period 1989–2009 of the interim European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-Interim) data. This experiment is part of the community project Intercomparison of Mid Latitude Storm Diagnostics (IMILAST; see www.proclim.ch/imilast/index.html). The spread of results for cyclone frequency, intensity, life cycle, and track location is presented to illustrate the impact of using different methods. Globally, methods agree well for geographical distribution in large oceanic regions, interannual variability of cyclone numbers, geographical patterns of strong trends, and distribution shape for many life cycle characteristics. In contrast, the largest disparities exist for the total numbers of cyclones, the detection of weak cyclones, and distribution in some densely populated regions. Consistency between methods is better for strong cyclones than for shallow ones. Two case studies of relatively large, intense cyclones reveal that the identification of the most intense part of the life cycle of these events is robust between methods, but considerable differences exist during the development and the dissolution phases.

Relevance: 30.00%

Abstract:

Northern Hemisphere cyclone activity is assessed by applying an algorithm for the detection and tracking of synoptic scale cyclones to mean sea level pressure data. The method, originally developed for the Southern Hemisphere, is adapted for application in the Northern Hemisphere winter season. NCEP-Reanalysis data from 1958/59 to 1997/98 are used as input. The sensitivities of the results to particular parameters of the algorithm are discussed for both case studies and from a climatological point of view. Results show that the choice of settings is of major relevance especially for the tracking of smaller scale and fast moving systems. With an appropriate setting the algorithm is capable of automatically tracking different types of cyclones at the same time: Both fast moving and developing systems over the large ocean basins and smaller scale cyclones over the Mediterranean basin can be assessed. The climatology of cyclone variables, e.g., cyclone track density, cyclone counts, intensification rates, propagation speeds and areas of cyclogenesis and -lysis gives detailed information on typical cyclone life cycles for different regions. The lowering of the spatial and temporal resolution of the input data from full resolution T62/06h to T42/12h decreases the cyclone track density and cyclone counts. Reducing the temporal resolution alone contributes to a decline in the number of fast moving systems, which is relevant for the cyclone track density. Lowering spatial resolution alone mainly reduces the number of weak cyclones.
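The detection step of such an algorithm can be caricatured as a search for local pressure minima in the MSLP field; the depth threshold and the toy field below are illustrative, not the algorithm's actual detection criteria:

```python
import numpy as np

def detect_cyclone_centres(mslp, max_hpa=1000.0):
    """Flag grid points that are strict local minima of mean sea level
    pressure (over the 8 neighbours) and below a depth threshold. A toy
    stand-in for the detection step of a cyclone tracking algorithm."""
    rows, cols = mslp.shape
    centres = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = mslp[r - 1:r + 2, c - 1:c + 2]
            if mslp[r, c] <= max_hpa and mslp[r, c] == window.min() \
                    and (window > mslp[r, c]).sum() == 8:
                centres.append((r, c))
    return centres

# A 5x5 MSLP field (hPa) with one deep low embedded in a uniform background.
field = np.full((5, 5), 1015.0)
field[2, 2] = 990.0
field[2, 1] = 1005.0
centres = detect_cyclone_centres(field)
```

The tracking step, not shown, would then link centres between consecutive time steps, which is exactly where the abstract notes sensitivity to temporal resolution for fast moving systems.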

Relevance: 30.00%

Abstract:

We describe a fluorometric assay for heme synthetase, the enzyme that is genetically deficient in erythropoietic protoporphyria. The method, which can readily detect activity in 1 microliter of packed human lymphocytes, is based on the formation of zinc protoheme from protoporphyrin IX. That zinc chelatase and ferrochelatase activities reside in the same enzyme was shown by the competitive action of ferrous ions and the inhibitory effects of N-methyl protoporphyrin (a specific inhibitor of heme synthetase) on zinc chelatase. The Km for zinc was 11 micrograms and that for protoporphyrin IX was 6 microM. The Ki for ferrous ions was 14 microM. Zinc chelatase was reduced to 15.3% of the mean control activity in lymphocytes obtained from patients with protoporphyria, thus confirming the defect of heme biosynthesis in this disorder. The assay should prove to be useful for determining heme synthetase in tissues with low specific activity and to investigate further the enzymatic defect in protoporphyria.
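The competitive action of ferrous ions can be illustrated with the standard competitive-inhibition rate law, plugging in the Km for protoporphyrin IX and the Ki for Fe2+ reported in the abstract (Vmax is set to 1 arbitrarily; treating the kinetics as simple Michaelis-Menten is an assumption for illustration):

```python
def rate_competitive(s, km, vmax=1.0, i=0.0, ki=float("inf")):
    """Michaelis-Menten rate with a competitive inhibitor:
    v = Vmax * S / (Km * (1 + I/Ki) + S)."""
    return vmax * s / (km * (1.0 + i / ki) + s)

# With Km = 6 uM for protoporphyrin IX and Ki = 14 uM for ferrous ions
# (values from the abstract), adding inhibitor at I = Ki doubles the
# apparent Km, so at S = Km the rate drops from Vmax/2 to Vmax/3.
v_no_inh = rate_competitive(s=6.0, km=6.0)                 # S = Km
v_inh = rate_competitive(s=6.0, km=6.0, i=14.0, ki=14.0)   # Km_app = 12 uM
```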

Relevance: 30.00%

Abstract:

Flooding is a particular hazard in urban areas worldwide due to the increased risks to life and property in these regions. Synthetic Aperture Radar (SAR) sensors are often used to image flooding because of their all-weather day-night capability, and now possess sufficient resolution to image urban flooding. The flood extents extracted from the images may be used for flood relief management and improved urban flood inundation modelling. A difficulty with using SAR for urban flood detection is that, due to its side-looking nature, substantial areas of urban ground surface may not be visible to the SAR due to radar layover and shadow caused by buildings and taller vegetation. This paper investigates whether urban flooding can be detected in layover regions (where flooding may not normally be apparent) using double scattering between the (possibly flooded) ground surface and the walls of adjacent buildings. The method estimates double scattering strengths using a SAR image in conjunction with a high resolution LiDAR (Light Detection and Ranging) height map of the urban area. A SAR simulator is applied to the LiDAR data to generate maps of layover and shadow, and estimate the positions of double scattering curves in the SAR image. Observations of double scattering strengths were compared to the predictions from an electromagnetic scattering model, for both the case of a single image containing flooding, and a change detection case in which the flooded image was compared to an un-flooded image of the same area acquired with the same radar parameters. The method proved successful in detecting double scattering due to flooding in the single-image case, for which flooded double scattering curves were detected with 100% classification accuracy (albeit using a small sample set) and un-flooded curves with 91% classification accuracy. The same measures of success were achieved using change detection between flooded and un-flooded images. 
Depending on the particular flooding situation, the method could lead to improved detection of flooding in urban areas.
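The change-detection comparison can be sketched as a per-curve strength test: a double-scattering curve is labelled flooded when its strength rises sufficiently relative to the un-flooded reference image. The 3 dB margin and the strength values below are illustrative, not the paper's classifier:

```python
def classify_curves(flood_strengths, dry_strengths, margin_db=3.0):
    """Change-detection sketch: label a ground-wall double-scattering curve
    'flooded' when its backscatter strength in the flood image exceeds the
    un-flooded reference by more than margin_db (illustrative threshold)."""
    return [f - d > margin_db
            for f, d in zip(flood_strengths, dry_strengths)]

# Strengths in dB for three double-bounce curves, flood image vs reference.
labels = classify_curves([-2.0, -9.5, -1.0], [-10.0, -10.0, -9.0])
```

In the paper the curve positions themselves come from running a SAR simulator over the LiDAR height map, so only plausible ground-wall pairs are tested.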

Relevance: 30.00%

Abstract:

Epigenetic regulations play important roles in plant development and adaptation to environmental stress. Recent studies from mammalian systems have demonstrated the involvement of the ten-eleven translocation (Tet) family of dioxygenases in the generation of a series of oxidized derivatives of 5-methylcytosine (5-mC) in mammalian DNA. In addition, these oxidized 5-mC nucleobases have important roles in epigenetic remodeling, and aberrant levels of 5-hydroxymethyl-2′-deoxycytidine (5-HmdC) were found to be associated with different types of human cancers. However, there is a lack of evidence supporting the presence of these modified bases in plant DNA. Here we report the use of a reversed-phase HPLC method coupled with tandem mass spectrometry and stable isotope-labeled standards for assessing the levels of the oxidized 5-mC nucleosides along with two other oxidatively induced DNA modifications in genomic DNA of Arabidopsis. These included 5-HmdC, 5-formyl-2′-deoxycytidine (5-FodC), 5-carboxyl-2′-deoxycytidine (5-CadC), 5-hydroxymethyl-2′-deoxyuridine (5-HmdU), and the (5′S) diastereomer of 8,5′-cyclo-2′-deoxyguanosine (S-cdG). We found that, in Arabidopsis DNA, the levels of 5-HmdC, 5-FodC, and 5-CadC are approximately 0.8 modifications per 10^6 nucleosides, with the frequency of 5-HmdC (per 5-mdC) being comparable to that of 5-HmdU (per thymidine). The relatively low levels of the 5-mdC oxidation products suggest that they likely arise from reactive oxygen species present in cells, which is in line with the lack of homologous Tet-family dioxygenase enzymes in Arabidopsis.
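Quantitation against stable isotope-labeled standards follows the usual isotope-dilution arithmetic: the analyte amount is the peak-area ratio to the labeled internal standard times the spiked standard amount, then expressed per 10^6 total nucleosides. The peak areas and amounts below are invented for illustration (chosen to land near the ~0.8 per 10^6 level the abstract reports), not measured values:

```python
def level_per_million(area_analyte, area_standard,
                      pmol_standard, pmol_nucleosides):
    """Stable isotope dilution quantitation: analyte amount from the
    peak-area ratio to the labeled internal standard, expressed per
    1e6 total nucleosides."""
    pmol_analyte = (area_analyte / area_standard) * pmol_standard
    return pmol_analyte / pmol_nucleosides * 1e6

# Hypothetical run: a 5-HmdC peak at 40% of the labeled standard's area,
# 0.01 pmol of standard spiked, 5000 pmol of nucleosides in the digest.
level = level_per_million(area_analyte=4.0e4, area_standard=1.0e5,
                          pmol_standard=0.01, pmol_nucleosides=5000.0)
```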

Relevance: 30.00%

Abstract:

The use of antibiotics in birds and animals intended for human consumption within the European Union (EU) and elsewhere has been subject to regulation prohibiting the use of antimicrobials as growth promoters and the use of last resort antibiotics, in an attempt to reduce the spread of multi-resistant Gram-negative bacteria. Given the inexorable spread of antibiotic resistance there is an increasing need for improved monitoring of our food. Using selective media, Gram-negative bacteria were isolated from retail chicken of UK-Intensively reared (n = 27), Irish-Intensively reared (n = 19) and UK-Free range (n = 30) origin and subjected to an oligonucleotide based array system for the detection of 47 clinically relevant antibiotic resistance genes (ARGs) and two integrase genes. High incidences of β-lactamase genes were noted in all sample types: acc (67%), cmy (80%), fox (55%) and tem (40%), while chloramphenicol resistance determinants were detected in bacteria from the UK poultry portions and were absent in bacteria from the Irish samples. Denaturing Gradient Gel Electrophoresis (DGGE) was used to qualitatively analyse the Gram-negative population in the samples and showed the expected diversity based on band stabbing and DNA sequencing. The array system proved to be a quick method for the detection of the antibiotic resistance gene (ARG) burden within a mixed Gram-negative bacterial population.

Relevance: 30.00%

Abstract:

Aim: To develop a brief, parent-completed instrument ('ERIC') for detection of cognitive delay in 10-24-month-olds born preterm, or with low birth weight, or with perinatal complications, and to establish its diagnostic properties. Method: Scores were collected from parents of 317 children meeting ≥1 inclusion criteria (birth weight <1500 g; gestational age <34 completed weeks; 5-minute Apgar <7; presence of hypoxic-ischemic encephalopathy) and meeting no exclusion criteria. Children were assessed for cognitive delay using a criterion score of <80 on the Bayley Scales of Infant and Toddler Development Cognitive Scale III. Items were retained according to their individual associations with delay. Sensitivity, specificity, and positive and negative predictive values were estimated, and a truncated ERIC was developed for use below 14 months. Results: ERIC detected 17 out of 18 delayed children in the sample, with 94.4% sensitivity (95% CI [confidence interval] 83.9-100%), 76.9% specificity (72.1-81.7%), 19.8% positive predictive value (11.4-28.2%), and 99.6% negative predictive value (98.7-100%); the likelihood ratio positive was 4.09 and the likelihood ratio negative 0.07; the associated area under the curve was .909 (.829-.960). Interpretation: ERIC has potential value as a quickly administered diagnostic instrument for the absence of early cognitive delay in preterm infants of 10-24 months, and as a screen for cognitive delay. Further research may be needed before ERIC can be recommended for wide-scale use.
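The reported figures follow from a standard 2x2 confusion table. With 17 of 18 delayed children detected, and 76.9% specificity among the 299 non-delayed children implying TN = 230 and FP = 69 (counts inferred here from the abstract's percentages, not stated in it), the abstract's values are reproduced:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, predictive values and likelihood ratios
    from the counts of a 2x2 confusion table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "lr_pos": sens / (1 - spec),          # likelihood ratio positive
        "lr_neg": (1 - sens) / spec,          # likelihood ratio negative
    }

# Counts consistent with the abstract: 18 delayed (17 detected, 1 missed)
# and 299 non-delayed (230 true negatives, 69 false positives).
m = diagnostic_metrics(tp=17, fn=1, fp=69, tn=230)
```

Recomputing from counts like this is a quick sanity check on reported screening statistics; the low PPV despite high sensitivity reflects the low prevalence of delay in the sample.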