888 results for Edge detection method
Abstract:
Traditionally, the real world has been reproduced for us through flat images. These images used to be materialised as paintings on canvas or as drawings. Today, fortunately, we can still see hand-made paintings, although most images are acquired by cameras and are either shown directly to an audience, as in cinema, television or photography exhibitions, or processed by a computerised system in order to obtain a particular result. Such processing is applied in fields ranging from industrial quality control to cutting-edge research in artificial intelligence. By applying mid-level processing algorithms, 3D images can be obtained from 2D images using the well-known family of techniques called Shape From X, where X denotes the method used to obtain the third dimension and varies with the technique employed. Although the evolution towards the 3D camera began in the 1990s, the techniques for obtaining three-dimensional shape need to become more and more accurate. The applications of 3D scanners have grown considerably in recent years, especially in fields such as entertainment, computer-assisted diagnosis/surgery, robotics, etc. One of the most widely used techniques for obtaining 3D information from a scene is triangulation and, more specifically, the use of three-dimensional laser scanners. Since their formal appearance in scientific publications in 1971 [SS71], there have been contributions addressing inherent problems such as the reduction of occlusions and the improvement of accuracy, acquisition speed, shape description, etc. Each and every method for obtaining 3D points from a scene has an associated calibration process, and this process plays a decisive role in the performance of a three-dimensional acquisition device.
The aim of this thesis is to address the problem of 3D shape acquisition from a global point of view: reporting a state of the art on triangulation-based laser scanners, testing the operation and performance of different systems, making contributions to improve the accuracy of laser peak detection, especially under adverse conditions, and solving the calibration problem by means of projective geometric methods.
Abstract:
The goal of this work is the numerical realization of the probe method suggested by Ikehata for the detection of an obstacle D in inverse scattering. The main idea of the method is to use probes in the form of point sources Φ(·, z) with source point z to define an indicator function Î(z) which can be reconstructed from Cauchy data or far-field data. The indicator function Î(z) can be shown to blow up when the source point z tends to the boundary ∂D, and this behavior can be used to find D. To study the feasibility of the probe method we will use two equivalent formulations of the indicator function. We will carry out the numerical realization of the functional and show reconstructions of a sound-soft obstacle.
Abstract:
We present a method to enhance fault localization for software systems based on a frequent pattern mining algorithm. Our method is based on a large set of test cases for a given set of programs in which faults can be detected. The test executions are recorded as function call trees. Based on test oracles the tests can be classified into successful and failing tests. A frequent pattern mining algorithm is used to identify frequent subtrees in successful and failing test executions. This information is used to rank functions according to their likelihood of containing a fault. The ranking suggests an order in which to examine the functions during fault analysis. We validate our approach experimentally using a subset of Siemens benchmark programs.
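The ranking step described above can be illustrated with a deliberately simplified sketch: instead of mining frequent call subtrees, it scores each function by how much of its coverage evidence comes from failing runs (a Tarantula-style suspiciousness score, used here only as a stand-in for the paper's pattern-mining approach; all names and data are hypothetical).

```python
# Simplified stand-in for the subtree-mining step: score each function by the
# fraction of its coverage evidence that comes from failing tests
# (Tarantula-style suspiciousness). Data and names are hypothetical.
def rank_functions(coverage, outcomes):
    """coverage: test -> set of executed functions; outcomes: test -> True if passed."""
    failed = [t for t, ok in outcomes.items() if not ok]
    passed = [t for t, ok in outcomes.items() if ok]
    funcs = set().union(*coverage.values())
    scores = {}
    for f in funcs:
        ef = sum(f in coverage[t] for t in failed) / max(len(failed), 1)
        ep = sum(f in coverage[t] for t in passed) / max(len(passed), 1)
        scores[f] = ef / (ef + ep) if ef + ep else 0.0
    # Functions seen mostly in failing runs are examined first.
    return sorted(funcs, key=lambda f: -scores[f]), scores

coverage = {"t1": {"f", "g"}, "t2": {"g"}, "t3": {"g", "h"}}
outcomes = {"t1": False, "t2": True, "t3": True}  # t1 is the failing test
ranking, scores = rank_functions(coverage, outcomes)
```

Here "f" appears only in the failing test, so it heads the ranking, which is the order a developer would inspect the functions during fault analysis.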
Abstract:
Platelets are small blood cells vital for hemostasis. Following vascular damage, platelets adhere to collagens and activate, forming a thrombus that plugs the wound and prevents blood loss. Stimulation of the platelet collagen receptor glycoprotein VI (GPVI) allows recruitment of proteins to receptor-proximal signaling complexes on the inner-leaflet of the plasma membrane. These proteins are often present at low concentrations; therefore, signaling-complex characterization using mass spectrometry is limited due to high sample complexity. We describe a method that facilitates detection of signaling proteins concentrated on membranes. Peripheral membrane proteins (reversibly associated with membranes) were eluted from human platelets with alkaline sodium carbonate. Liquid-phase isoelectric focusing and gel electrophoresis were used to identify proteins that changed in levels on membranes from GPVI-stimulated platelets. Immunoblot analysis verified protein recruitment to platelet membranes and subsequent protein phosphorylation was preserved. Hsp47, a collagen binding protein, was among the proteins identified and found to be exposed on the surface of GPVI-activated platelets. Inhibition of Hsp47 abolished platelet aggregation in response to collagen, while only partially reducing aggregation in response to other platelet agonists. We propose that Hsp47 may therefore play a role in hemostasis and thrombosis.
Abstract:
The distribution of sulphate-reducing bacteria (SRB) in the sediments of the Colne River estuary, Essex, UK, covering different saline concentrations of sediment porewater, was investigated by the use of quantitative competitive PCR. Here, we show that a new PCR primer set and a new quantitative PCR method are useful tools for the detection and enumeration of SRB in natural environments. A PCR primer set selective for the dissimilatory sulphite reductase gene (dsr) of SRB was designed. PCR amplification using the single set of dsr-specific primers resulted in PCR products of the expected size from all 27 SRB strains tested, including Gram-negative and Gram-positive species. Sixty clones derived from sediment DNA using the primers were sequenced, and all were closely related to the predicted dsr of SRB. These results indicate that PCR using the newly designed primer set is useful for the selective detection of SRB from a natural sample. This primer set was used to estimate cell numbers by dsr-selective competitive PCR using a competitor, which was about 20% shorter than the targeted region of dsr. This procedure was applied to sediment samples from the River Colne estuary, Essex, UK, together with simultaneous measurement of in situ rates of sulphate reduction. High densities of SRB, ranging from 0.2 to 5.7 × 10⁸ cells ml⁻¹ wet sediment, were estimated by the competitive PCR, assuming that all SRB have a single copy of dsr. Using these estimates, cell-specific sulphate reduction rates of 10⁻¹⁷ to 10⁻¹⁵ mol of SO₄²⁻ cell⁻¹ day⁻¹ were calculated, which is within the range of, or lower than, those previously reported for pure cultures of SRB. Our results show that the newly developed competitive PCR technique targeting dsr is a powerful tool for rapid and reproducible estimation of SRB numbers in situ and is superior to culture-dependent techniques.
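The cell-specific rate calculation is simple arithmetic: the bulk sulphate-reduction rate per unit of wet sediment is divided by the competitive-PCR cell density (assuming one dsr copy per cell). A back-of-envelope check, with a hypothetical bulk rate that is not taken from the paper:

```python
# Back-of-envelope check of the cell-specific rate: bulk sulphate-reduction
# rate (mol SO4^2- per ml wet sediment per day) divided by the competitive-PCR
# cell density, assuming one dsr copy per cell. The bulk rate below is a
# hypothetical illustration, not a value reported in the paper.
def per_cell_rate(bulk_rate_mol_per_ml_day, cells_per_ml):
    return bulk_rate_mol_per_ml_day / cells_per_ml

# Hypothetical bulk rate with the highest reported density (5.7e8 cells/ml):
rate = per_cell_rate(5e-8, 5.7e8)
```

With these illustrative inputs the result falls inside the 10⁻¹⁷ to 10⁻¹⁵ mol cell⁻¹ day⁻¹ range quoted in the abstract.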
Abstract:
A method is described for the analysis of deuterated and undeuterated alpha-tocopherol in blood components using liquid chromatography coupled to an orthogonal acceleration time-of-flight (TOF) mass spectrometer. Optimal ionisation conditions for undeuterated (d0) and tri- and hexadeuterated (d3 or d6) alpha-tocopherol standards were found with negative ion mode electrospray ionisation. Each species produced an isotopically resolved single ion of exact mass. Calibration curves of pure standards were linear in the range tested (0-1.5 μM, 0-15 pmol injected). For quantification of d0 and d6 in blood components following a standard solvent extraction, a stable-isotope-labelled internal standard (d3-alpha-tocopherol) was employed. To counter matrix ion suppression effects, standard response curves were generated following identical solvent extraction procedures to those of the samples. Within-day and between-day precision were determined for quantification of d0- and d6-labelled alpha-tocopherol in each blood component and both averaged 3-10%. Accuracy was assessed by comparison with a standard high-performance liquid chromatography (HPLC) method, achieving good correlation (r² = 0.94), and by spiking with known concentrations of alpha-tocopherol (98% accuracy). Limits of detection and quantification were determined to be 5 and 50 fmol injected, respectively. The assay was used to measure the appearance and disappearance of deuterium-labelled alpha-tocopherol in human blood components following deuterium-labelled (d6) RRR-alpha-tocopheryl acetate ingestion. The new LC/TOFMS method was found to be sensitive, required small sample volumes, was reproducible and robust, and was capable of high throughput when large numbers of samples were generated. Copyright (C) 2003 John Wiley & Sons, Ltd.
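The internal-standard quantification step follows the usual stable-isotope-dilution arithmetic: the analyte peak area is referenced to the d3 internal standard spiked at a known amount, via a response-ratio calibration. A minimal sketch with illustrative numbers (not values from the paper):

```python
# Sketch of stable-isotope-dilution quantification: the d0 (or d6) peak area is
# referenced to the d3 internal standard added at a known amount, using a
# response-factor calibration. All numbers are illustrative, not from the paper.
def quantify(area_analyte, area_istd, istd_pmol, response_factor=1.0):
    """Amount of analyte (pmol) from the peak-area ratio to the internal standard."""
    return (area_analyte / area_istd) * istd_pmol / response_factor

amount = quantify(area_analyte=2.4e5, area_istd=1.2e5, istd_pmol=5.0)
```

Because analyte and internal standard pass through the same extraction, matrix losses and ion suppression largely cancel in the ratio, which is the design rationale the abstract alludes to.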
Abstract:
This paper represents the first step in ongoing work on designing an unsupervised method based on a genetic algorithm for intrusion detection. Its main role within a broader system is to flag unusual traffic and thereby provide the possibility of detecting unknown attacks. Most machine-learning techniques deployed for intrusion detection are supervised, as these techniques are generally more accurate, but this implies the need to label the data for training and testing, which is time-consuming and error-prone. Hence, our goal is to devise an anomaly detector that is unsupervised but at the same time robust and accurate. Genetic algorithms are robust and able to avoid getting stuck in local optima, unlike many other clustering techniques. The model is verified on the KDD99 benchmark dataset, generating a solution competitive with state-of-the-art solutions, which demonstrates the potential of the proposed method.
Abstract:
In this work a hybrid technique that combines probabilistic and optimization-based methods is presented. The method is applied, both in simulation and in real-time experiments, to the heating unit of a Heating, Ventilation and Air Conditioning (HVAC) system. It is shown that the addition of the probabilistic approach improves the fault diagnosis accuracy.
Abstract:
Transient episodes of synchronisation of neuronal activity in particular frequency ranges are thought to underlie cognition. Empirical mode decomposition phase locking (EMDPL) analysis is a method for determining the frequency and timing of phase synchrony that is adaptive to intrinsic oscillations within data, alleviating the need for arbitrary bandpass filter cut-off selection. It is extended here to address the choice of reference electrode and removal of spurious synchrony resulting from volume conduction. Spline Laplacian transformation and independent component analysis (ICA) are performed as pre-processing steps, and preservation of phase synchrony between synthetic signals, combined using a simple forward model, is demonstrated. The method is contrasted with use of bandpass filtering following the same preprocessing steps, and filter cut-offs are shown to influence synchrony detection markedly. Furthermore, an approach to the assessment of multiple EEG trials using the method is introduced, and the assessment of statistical significance of phase locking episodes is extended to render it adaptive to local phase synchrony levels. EMDPL is validated in the analysis of real EEG data recorded during finger tapping. The time course of event-related (de)synchronisation (ERD/ERS) is shown to differ from that of longer range phase locking episodes, implying different roles for these different types of synchronisation. It is suggested that the increase in phase locking which occurs just prior to movement, coinciding with a reduction in power (or ERD), may result from selection of the neural assembly relevant to the particular movement. (C) 2009 Elsevier B.V. All rights reserved.
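The phase-locking quantity at the heart of such analyses can be sketched compactly: the phase-locking value (PLV) is the magnitude of the time-averaged unit phasor of the phase difference. In EMDPL the instantaneous phases come from Hilbert transforms of intrinsic mode functions; the sketch below synthesises fixed-frequency phase ramps instead, purely for illustration.

```python
import cmath
import math

# Minimal phase-locking value (PLV): |mean over time of exp(i * phase diff)|.
# A constant phase lag gives PLV near 1; independent frequencies give PLV near 0.
# Phases here are synthetic ramps, standing in for Hilbert-derived phases.
def plv(phases_a, phases_b):
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(s) / len(phases_a)

t = [k * 0.01 for k in range(1000)]                 # 10 s at 100 Hz sampling
locked_a = [2 * math.pi * 10 * x for x in t]        # 10 Hz phase ramp
locked_b = [2 * math.pi * 10 * x + 0.5 for x in t]  # same frequency, fixed lag
drifting = [2 * math.pi * 11 * x for x in t]        # different frequency
```

`plv(locked_a, locked_b)` is 1 (perfect locking despite the lag), while `plv(locked_a, drifting)` is close to 0, which is the contrast the spurious-synchrony and filter cut-off discussion turns on.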
Abstract:
The level set method is commonly used to address image noise removal. Existing studies concentrate mainly on determining the speed function of the evolution equation. Based on the idea of a Canny operator, this letter introduces a new method of controlling the level set evolution, in which the edge strength is taken into account in choosing curvature flows for the speed function and the normal to edge direction is used to orient the diffusion of the moving interface. The addition of an energy term to penalize the irregularity allows for better preservation of local edge information. In contrast with previous Canny-based level set methods that usually adopt a two-stage framework, the proposed algorithm can execute all the above operations in one process during noise removal.
Abstract:
A low cost, disposable instrument for measuring solar radiation during meteorological balloon flights through cloud layers is described. Using a photodiode detector and low thermal drift signal conditioning circuitry, the device showed less than 1% drift as the temperature varied from +20 °C to −35 °C. The angular response to radiation, which declined less rapidly than the cosine of the angle between the incident radiation and normal incidence, is used for cloud detection by exploiting the motion of the platform. Oriented upwards, the natural motion imposed by the balloon allows cloud and clear air to be distinguished by the absence of radiation variability within cloud, where the diffuse radiation present is isotropic. The optical method employed by the solar radiation instrument has also been demonstrated to provide higher resolution measurements of cloud boundaries than relative humidity measurements alone.
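The variability criterion lends itself to a short sketch: under the swinging balloon platform, direct sunlight in clear air modulates the photodiode signal, while in-cloud diffuse (isotropic) light does not, so windows with a low coefficient of variation can be flagged as cloud. The window length and threshold below are illustrative assumptions, not values from the paper.

```python
import statistics

# Sketch of the variability criterion: the swinging platform modulates the
# signal in clear-air direct sunlight but not in isotropic in-cloud diffuse
# light. Windows with a low coefficient of variation are flagged as cloud.
# Window length and threshold are illustrative, not from the paper.
def flag_cloud(signal, window=10, cv_threshold=0.02):
    flags = []
    for i in range(0, len(signal) - window + 1, window):
        w = signal[i:i + window]
        cv = statistics.pstdev(w) / statistics.mean(w)
        flags.append(cv < cv_threshold)
    return flags

clear = [100 + 20 * ((k % 4) - 1.5) for k in range(20)]  # strongly modulated
cloud = [40 + 0.1 * (k % 2) for k in range(20)]          # near-constant diffuse
flags = flag_cloud(clear + cloud)
```

The synthetic clear-air segment yields high relative variability (not cloud) and the near-constant segment yields low variability (cloud), matching the detection logic described above.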
Abstract:
Very high-resolution Synthetic Aperture Radar sensors represent an alternative to aerial photography for delineating floods in built-up environments where flood risk is highest. However, even with currently available SAR image resolutions of 3 m and higher, signal returns from man-made structures hamper the accurate mapping of flooded areas. Enhanced image processing algorithms and a better exploitation of image archives are required to facilitate the use of microwave remote sensing data for monitoring flood dynamics in urban areas. In this study a hybrid methodology combining radiometric thresholding, region growing and change detection is introduced as an approach enabling the automated, objective and reliable flood extent extraction from very high-resolution urban SAR images. The method is based on the calibration of a statistical distribution of “open water” backscatter values inferred from SAR images of floods. SAR images acquired during dry conditions enable the identification of areas i) that are not “visible” to the sensor (i.e. regions affected by ‘layover’ and ‘shadow’) and ii) that systematically behave as specular reflectors (e.g. smooth tarmac, permanent water bodies). Change detection with respect to a pre- or post-flood reference image thereby reduces over-detection of inundated areas. A case study of the July 2007 Severn River flood (UK), observed by the very high-resolution SAR sensor on board TerraSAR-X as well as by airborne photography, highlights advantages and limitations of the proposed method. We conclude that even though the fully automated SAR-based flood mapping technique overcomes some limitations of previous methods, further technological and methodological improvements are necessary for SAR-based flood detection in urban areas to match the flood mapping capability of high quality aerial photography.
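The thresholding-plus-change-detection idea can be reduced to a toy per-pixel sketch: pixels darker than an "open water" backscatter threshold in the flood image are candidate water, and pixels that are already dark in the dry reference image (permanent water, smooth tarmac, shadow) are excluded. The dB values and threshold are illustrative assumptions, and the paper's region growing and calibrated statistical distribution are omitted.

```python
# Toy per-pixel sketch of thresholding + change detection for SAR flood maps.
# Candidate water = dark in the flood image; excluded if also dark in the dry
# reference (permanent water, tarmac, shadow). Values/threshold are
# illustrative dB figures; region growing is omitted.
def flood_mask(flood_db, dry_db, water_threshold=-14.0):
    mask = []
    for f, d in zip(flood_db, dry_db):
        candidate = f < water_threshold    # dark (specular) in the flood image
        always_dark = d < water_threshold  # dark in dry conditions too
        mask.append(candidate and not always_dark)
    return mask

flood_db = [-20.0, -18.0, -6.0, -19.0]
dry_db   = [-5.0, -17.0, -6.5, -4.0]
mask = flood_mask(flood_db, dry_db)
```

Pixel 2 is rejected because it is dark in both images, which is exactly how the reference image suppresses over-detection from permanent water bodies.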
Abstract:
A polymerase chain reaction (PCR) assay was developed to detect Chlamydia psittaci DNA in faeces and tissue samples from avian species. Primers were designed to amplify a 264 bp product derived from part of the 5' non-translated region and part of the coding region of the ompA gene, which encodes the major outer membrane protein. Amplified sequences were confirmed by Southern hybridization using an internal probe. The sensitivity of the combined assay was found to be between 60 and 600 fg of chlamydial DNA (approximately 6 to 60 genome copies). The specificity of the assay was confirmed since PCR product was not obtained from samples containing several serotypes of C. trachomatis, strains of C. pneumoniae, or the type strain of C. pecorum, nor from samples containing microorganisms commonly found in the avian gut flora. In this study, 404 avian faeces and 141 avian tissue samples received by the Central Veterinary Laboratory over a 6 month period were analysed by PCR, antigen detection ELISA and, where possible, cell culture isolation. PCR performed favourably compared with ELISA and cell culture, or with ELISA alone. The PCR assay was especially suited to the detection of C. psittaci DNA in avian faeces samples. The test was also useful when applied to tissue samples from small contact birds associated with a case of human psittacosis, where ELISA results were negative and chlamydial isolation was a less favourable method due to the need for rapid diagnosis.
Abstract:
Denaturing high-performance liquid chromatography (DHPLC) was evaluated as a rapid screening and identification method for DNA sequence variation detection in the quinolone resistance-determining region of gyrA from Salmonella serovars. A total of 203 isolates of Salmonella were screened using this method. DHPLC analysis of 14 isolates representing each type of novel or multiple mutations and the wild type were compared with LightCycler-based PCR-gyrA hybridization mutation assay (GAMA) and single-strand conformational polymorphism (SSCP) analyses. The 14 isolates gave seven different SSCP patterns, and LightCycler detected four different mutations. DHPLC detected 11 DNA sequence variants at eight different codons, including those detected by LightCycler or SSCP. One of these mutations was silent. Five isolates contained multiple mutations, and four of these could be distinguished from the composite sequence variants by their DHPLC profile. Seven novel mutations were identified at five different loci not previously described in quinolone-resistant salmonella. DHPLC analysis proved advantageous for the detection of novel and multiple mutations. DHPLC also provides a rapid, high-throughput alternative to LightCycler and SSCP for screening frequently occurring mutations.
Abstract:
Safety is of the highest priority in mining operations, and many traditional mining countries are currently investing in the implementation of wireless sensors capable of detecting risk factors, providing early warning signs to prevent accidents and significant economic losses. The objective of this research is to contribute to the implementation of sensors for continuous monitoring inside underground mines by providing technical parameters for the design of sensor networks applied in underground coal mines. The application of sensors capable of measuring variables of interest in real time promises to have a great impact on safety in the mining industry. The relationship between geological conditions and mining method design establishes how a continuous monitoring system should be implemented. In this paper, the main causes of accidents in underground coal mines are established based on existing worldwide reports. Variables (temperature, gas, structural faults, fires) that can be related to the most frequent causes of disaster, together with their relevant measuring ranges, are then presented; the advantages, management and mining operations are also discussed, including an analysis of applying these systems in terms of Benefit, Opportunity, Cost and Risk. The publication focuses on coal mining because of the proportion of these events occurring each year worldwide, in which a significant number of workers are seriously injured or killed. Finally, a dynamic assessment of safety in underground mines is proposed; this approach offers a contribution to the design of personalised monitoring networks, and the experience developed in coal mines provides a tool that facilitates the application of this technology within underground coal mines.