942 results for Feature Point Detection
Abstract:
Frequency Response Analysis is a well-known technique for the diagnosis of power transformers. Currently, this technique is being investigated for application to rotating electrical machines. This paper presents significant results on the application of Frequency Response Analysis to fault detection in the field winding of synchronous machines with static excitation. First, the influence of the rotor position on the frequency response is evaluated. Second, relevant test results are shown regarding ground-fault and inter-turn-fault detection in field windings at standstill. The influence of the fault resistance value is also taken into account. The paper also studies the applicability of Frequency Response Analysis to fault detection in field windings while rotating, an important feature because some defects only appear at the machine's rated speed. Several laboratory test results show the applicability of this fault detection technique in field windings at full speed with no excitation current.
Abstract:
It has long been known that cholera outbreaks can be initiated when Vibrio cholerae, the bacterium that causes cholera, is present in drinking water in sufficient numbers to constitute an infective dose, if ingested by humans. Outbreaks associated with drinking or bathing in unpurified river or brackish water may directly or indirectly depend on such conditions as water temperature, nutrient concentration, and plankton production that may be favorable for growth and reproduction of the bacterium. Although these environmental parameters have routinely been measured by using water samples collected aboard research ships, the available data sets are sparse and infrequent. Furthermore, shipboard data acquisition is both expensive and time-consuming. Interpolation to regional scales can also be problematic. Although the bacterium, V. cholerae, cannot be sensed directly, remotely sensed data can be used to infer its presence. In the study reported here, satellite data were used to monitor the timing and spread of cholera. Public domain remote sensing data for the Bay of Bengal were compared directly with cholera case data collected in Bangladesh from 1992–1995. The remote sensing data included sea surface temperature and sea surface height. It was discovered that sea surface temperature shows an annual cycle similar to the cholera case data. Sea surface height may be an indicator of incursion of plankton-laden water inland, e.g., tidal rivers, because it was also found to be correlated with cholera outbreaks. The extensive studies accomplished during the past 25 years, confirming the hypothesis that V. cholerae is autochthonous to the aquatic environment and is a commensal of zooplankton, i.e., copepods, when combined with the findings of the satellite data analyses, provide strong evidence that cholera epidemics are climate-linked.
Abstract:
The observation of light metal ions in nucleic acids crystals is generally a fortuitous event. Sodium ions in particular are notoriously difficult to detect because their X-ray scattering contributions are virtually identical to those of water and Na+…O distances are only slightly shorter than strong hydrogen bonds between well-ordered water molecules. We demonstrate here that replacement of Na+ by K+, Rb+ or Cs+ and precise measurements of anomalous differences in intensities provide a particularly sensitive method for detecting alkali metal ion-binding sites in nucleic acid crystals. Not only can alkali metal ions be readily located in such structures, but the presence of Rb+ or Cs+ also allows structure determination by the single wavelength anomalous diffraction technique. Besides allowing identification of high occupancy binding sites, the combination of high resolution and anomalous diffraction data established here can also pinpoint binding sites that feature only partial occupancy. Conversely, high resolution of the data alone does not necessarily allow differentiation between water and partially ordered metal ions, as demonstrated with the crystal structure of a DNA duplex determined to a resolution of 0.6 Å.
Abstract:
An unusual feature of the mammalian genome is the number of genes exhibiting monoallelic expression. Recently random monoallelic expression of autosomal genes has been reported for olfactory and Ly-49 NK receptor genes, as well as for Il-2, Il-4 and Pax5. RNA fluorescence in situ hybridization (FISH) has been exploited to monitor allelic expression by visualizing the number of sites of transcription in individual nuclei. However, the sensitivity of this technique is difficult to determine for a given gene. We show that by combining DNA and RNA FISH it is possible to control for the hybridization efficiency and the accessibility and visibility of fluorescent probes within the nucleus.
Abstract:
A method was developed to detect 5' ends of bacterial RNAs expressed at low levels and to differentiate newly initiated transcripts from processed transcripts produced in vivo. The procedure involves use of RNA ligase to link a specific oligoribonucleotide to the 5' ends of cellular RNAs, followed by production of cDNA and amplification of the gene of interest by PCR. The method was used to identify the precise sites of transcription initiation within a 10-kb region of the pheromone-inducible conjugative plasmid pCF10 of Enterococcus faecalis. Results confirmed the 5' end of a very abundant, constitutively produced transcript (from prgQ) that had been mapped previously by primer extension and defined the initiation point of a less abundant, divergently transcribed message (from prgX). The method also showed that the 5' end of a pheromone-inducible transcript (prgB) that had been mapped by primer extension was generated by processing rather than new initiation. In addition, the results provided evidence for two promoters, 3 and 5 kb upstream of prgB, and indicated that only the transcripts originating 5 kb upstream may be capable of extending to prgB.
Abstract:
A distribution of tumor size at detection is derived within the framework of a mechanistic model of carcinogenesis with the object of estimating biologically meaningful parameters of tumor latency. Its limiting form appears to be a generalization of the distribution that arises in the length-biased sampling from stationary point processes. The model renders the associated estimation problems tractable. The usefulness of the proposed approach is illustrated with an application to clinical data on premenopausal breast cancer.
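For context, the classical length-biased form the abstract generalizes can be stated in its standard textbook version (this is the generic definition, not the paper's generalization): if tumor sizes have density $f$, a size-biased sample observes size $x$ with probability proportional to $x$:

```latex
f^{*}(x) \;=\; \frac{x \, f(x)}{\mathbb{E}[X]},
\qquad \mathbb{E}[X] \;=\; \int_{0}^{\infty} x \, f(x)\, dx .
```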
Abstract:
This paper presents a preliminary study in which Machine Learning experiments applied to Opinion Mining in blogs have been carried out. We created and annotated a blog corpus in Spanish using EmotiBlog. We evaluated the utility of the labelled features, first by carrying out experiments with combinations of them and then by applying feature selection techniques. We also dealt with several problems, such as the noisy character of the input texts, the small size of the training set, the granularity of the annotation scheme, and the language under study, Spanish, which has fewer resources than English. We obtained promising results considering that this is a preliminary study.
Abstract:
Several recent works deal with 3D data in mobile robotic problems, e.g. mapping or egomotion. The data come from sensors such as stereo vision systems, time-of-flight cameras or 3D lasers, providing a huge amount of unorganized 3D data. In this paper, we describe an efficient method to build complete 3D models using a Growing Neural Gas (GNG). The GNG is applied to the raw 3D data and reduces both the underlying error and the number of points while keeping the topology of the 3D data. The GNG output is then used in a 3D feature extraction method. We performed a thorough study in which we quantitatively show that the use of GNG improves the 3D feature extraction method. We also show that our method can be applied to any kind of 3D data. The extracted 3D features are used as input to an Iterative Closest Point (ICP)-like method to compute the 6DoF movement performed by a mobile robot. A comparison with standard ICP shows that the use of GNG improves the results. Final results of 3D mapping from the calculated egomotion are also presented.
Abstract:
Rock mass characterization requires a deep geometric understanding of the discontinuity sets affecting rock exposures. Recent advances in Light Detection and Ranging (LiDAR) instrumentation allow quick and accurate 3D data acquisition, leading to the development of new methodologies for the automatic characterization of rock mass discontinuities. This paper presents a methodology for the identification and analysis of flat surfaces outcropping in a rocky slope using 3D data obtained with LiDAR. The method identifies and defines the algebraic equations of the different planes of the rock slope surface by applying a coplanarity test to neighbouring points, finding principal orientations by Kernel Density Estimation, and identifying clusters with the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm. Different sources of information (synthetic and 3D scanned data) were employed, and a complete sensitivity analysis of the parameters was performed in order to identify their optimal values. In addition, the raw source files and the obtained results are freely provided to allow more straightforward method comparison and more reproducible research.
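The geometric criterion at the heart of a neighbouring-points coplanarity test can be illustrated with a minimal, stdlib-only sketch for four points (the paper's test operates on full point neighbourhoods; the `tol` threshold here is an arbitrary illustrative value):

```python
def coplanar(p0, p1, p2, p3, tol=1e-9):
    """Four points are coplanar iff the scalar triple product of the
    three difference vectors (the volume of the parallelepiped they
    span) is zero."""
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))
    u, v, w = sub(p1, p0), sub(p2, p0), sub(p3, p0)
    # Cross product v x w, then dot with u -> signed volume.
    cross = (v[1] * w[2] - v[2] * w[1],
             v[2] * w[0] - v[0] * w[2],
             v[0] * w[1] - v[1] * w[0])
    volume = abs(u[0] * cross[0] + u[1] * cross[1] + u[2] * cross[2])
    return volume < tol

print(coplanar((0, 0, 0), (1, 0, 0), (0, 1, 0), (2, 3, 0)))  # True
print(coplanar((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)))  # False
```

In practice a noise-tolerant version of this idea (e.g. thresholding the residual of a least-squares plane fit over a whole neighbourhood) is used rather than an exact zero-volume test.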
Abstract:
Outliers are objects that show abnormal behavior with respect to their context or that have unexpected values in some of their parameters. In decision-making processes, information quality is of the utmost importance. In specific applications, an outlying data element may represent an important deviation in a production process or a damaged sensor. Therefore, the ability to detect these elements could make the difference between making a correct and an incorrect decision. This task is complicated by the large sizes of typical databases. Due to their importance in search processes in large volumes of data, researchers pay special attention to the development of efficient outlier detection techniques. This article presents a computationally efficient algorithm for the detection of outliers in large volumes of information. This proposal is based on an extension of the mathematical framework upon which the basic theory of detection of outliers, founded on Rough Set Theory, has been constructed. From this starting point, current problems are analyzed; a detection method is proposed, along with a computational algorithm that allows the performance of outlier detection tasks with an almost-linear complexity. To illustrate its viability, the results of the application of the outlier-detection algorithm to the concrete example of a large database are presented.
Abstract:
The complete characterization of rock masses implies the acquisition of information on both the materials that compose the rock mass and the discontinuities that divide the outcrop. Recent advances in remote sensing techniques, such as Light Detection and Ranging (LiDAR), allow the accurate and dense acquisition of 3D information that can be used for the characterization of discontinuities. This work presents a novel methodology for calculating the normal spacing of persistent and non-persistent discontinuity sets from 3D point cloud datasets, considering the three-dimensional relationships between clusters. The approach requires a previously classified 3D dataset: the discontinuity sets have been extracted, every point is labeled with its corresponding discontinuity set, and every exposed planar surface is analytically defined. For each discontinuity set, the method then calculates the normal spacing between an exposed plane and its nearest plane, considering their spatial relationship in 3D. This link between planes is obtained by finding, for every point, its nearest point belonging to the same discontinuity set, which identifies its nearest plane and allows the normal spacing to be calculated for every plane. Finally, the normal spacing of each discontinuity set is calculated as the mean of all its individual normal spacings. The methodology is validated through three case studies using synthetic data and 3D laser scanning datasets. The first case illustrates the fundamentals and the performance of the proposed methodology. The second and third case studies correspond to two rock slopes for which datasets were acquired with a 3D laser scanner. The second case study shows that the results obtained with the traditional and the proposed approaches are reasonably similar.
Nevertheless, a discrepancy between the two approaches was found when the exposed planes belonging to a discontinuity set were hard to identify and when plane pairing was difficult to establish during the fieldwork campaign. The third case study also shows that when the number of identified exposed planes is high, the normal spacing calculated with the proposed approach is smaller than that obtained with the traditional approach.
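As a much simplified, stdlib-only sketch of the spacing computation, assume every plane of a set shares an exact unit normal, so the normal spacing between neighbouring planes reduces to the difference of their offsets d in ax + by + cz + d = 0 (the actual method pairs planes through nearest points in the 3D cloud; the set labels and plane values below are hypothetical):

```python
def mean_normal_spacing(planes):
    """Mean distance between consecutive parallel planes of each set.
    `planes` is a list of (set_label, (a, b, c, d)) with (a, b, c) a
    unit normal shared by all planes of the set."""
    by_set = {}
    for label, (a, b, c, d) in planes:
        by_set.setdefault(label, []).append(d)
    result = {}
    for label, offsets in by_set.items():
        offsets.sort()
        # Gap between each plane and its nearest neighbour along the normal.
        gaps = [d2 - d1 for d1, d2 in zip(offsets, offsets[1:])]
        result[label] = sum(gaps) / len(gaps)
    return result

planes = [
    ("J1", (0.0, 0.0, 1.0, -1.0)),
    ("J1", (0.0, 0.0, 1.0, -3.5)),
    ("J1", (0.0, 0.0, 1.0, -6.0)),
    ("J2", (1.0, 0.0, 0.0, -2.0)),
    ("J2", (1.0, 0.0, 0.0, -4.0)),
]
print(mean_normal_spacing(planes))  # {'J1': 2.5, 'J2': 2.0}
```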
Abstract:
This dataset consists of 2D footprints of the buildings in the metropolitan Boston area, based on tiles in the orthoimage index (orthophoto quad ID: 229890, 229894, 229898, 229902, 233886, 233890, 233894, 233898, 233902, 237890, 237894, 237898, 237902, 241890, 241894, 241898, 241902, 245898, 245902). This data set was collected using 3Di's Digital Airborne Topographic Imaging System II (DATIS II). Roof height and footprint elevation attributes (derived from 1-meter resolution LIDAR (LIght Detection And Ranging) data) are included as part of each building feature. This data can be combined with other datasets to create 3D representations of buildings and the surrounding environment.
Abstract:
Emulsion detectors feature a very high position resolution and consequently represent an ideal device when particle detection is required at the micrometric scale. This is the case in quantum interferometry studies with antimatter, where micrometric fringes have to be measured. In this framework, we designed and realized a new emulsion-based detector characterized by a gel enriched in silver bromide crystal content poured on a glass plate. We tested the sensitivity of this detector to low-energy positrons in the range 10–20 keV. The obtained results prove that nuclear emulsions are highly efficient at detecting positrons at these energies. This achievement paves the way to performing matter-wave interferometry with positrons using this technology.
Abstract:
Considering the importance of properly detecting bubbles in financial markets for policymakers and market agents, we used two techniques, described in Diba and Grossman (1988b) and in Phillips, Shi, and Yu (2015), to detect periods of exuberance in the recent history of the Brazilian stock market. First, a simple cointegration test is applied. Second, we conducted several augmented, right-tailed Dickey-Fuller tests on rolling windows of data to determine the point at which there is a structural break and the series loses its stationarity.
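The Phillips–Shi–Yu procedure compares sup ADF statistics against right-tailed critical values; as a minimal, stdlib-only illustration of the rolling-window idea (not the actual test, which uses augmented Dickey-Fuller regressions), one can estimate a plain AR(1) coefficient on each window and flag windows whose root exceeds unity:

```python
def ar1_coef(x):
    """OLS estimate of rho in x[t] = rho * x[t-1] + e[t] (no intercept)."""
    num = sum(a * b for a, b in zip(x[1:], x[:-1]))
    den = sum(a * a for a in x[:-1])
    return num / den

def rolling_explosive(series, window):
    """Flag each rolling window whose estimated AR(1) root exceeds 1
    (i.e. the process looks explosive over that window)."""
    return [ar1_coef(series[i:i + window]) > 1.0
            for i in range(len(series) - window + 1)]

# Toy series: mean-reverting segment (rho = 0.9) followed by an
# explosive bubble segment (rho = 1.2).
series = [0.9 ** t for t in range(30)]
series += [series[-1] * 1.2 ** t for t in range(1, 30)]
flags = rolling_explosive(series, window=20)
print(flags[0], flags[-1])  # False True
```

The real test additionally handles lag augmentation, an intercept/trend, and the non-standard right-tailed critical values tabulated by Phillips, Shi, and Yu.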
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-04