Abstract:
In order to evaluate the use of the shallow seismic technique to delineate geological and geotechnical features up to 40 meters deep in noisy urban areas covered with asphalt pavement, five survey lines were conducted in the metropolitan area of São Paulo City. The data were acquired using a 24-bit, 24-channel seismograph, 30 and 100 Hz geophones and a sledgehammer-plate system as the seismic source. Seismic reflection data were recorded using the CMP (common midpoint) acquisition method. The processing routine consisted of: prestack band-pass filtering (90-250 Hz); automatic gain control (AGC); muting (digital zeroing) of dead/noisy traces, ground roll, air wave and refracted wave; CMP sorting; velocity analysis; normal move-out corrections; residual static corrections; f-k filtering; and CMP stacking. The near surface is geologically characterized by unconsolidated fill materials and Quaternary sediments with organic material overlying Tertiary sediments, with the water table 2 to 5 m below the surface. The basement is composed of granite and gneiss. Reflections were observed from 40 to 65 ms two-way traveltime and were related to the contact between the silty clay and fine sand layers of the Tertiary sediments and to the weathered basement. The CMP seismic-reflection technique has been shown to be useful for mapping the sedimentary layers and the bedrock of the São Paulo sedimentary basin for shallow investigations related to engineering problems. In spite of the strong cultural noise observed in these urban areas and problems with planting geophones, we verified that, with the proper equipment, suitable field parameters and particularly great care in data collection and processing, we can overcome the adverse field conditions and image reflections from layers as shallow as 20 meters.
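The automatic gain control step in the processing routine above can be sketched as a sliding-window RMS normalization of a trace (a minimal illustration; the window length and the synthetic decaying trace below are assumptions, not the survey's actual parameters):

```python
import numpy as np

def agc(trace, window=50):
    """Automatic gain control: divide each sample by the RMS
    amplitude in a sliding window centred on it, so weak late
    arrivals are boosted to the level of strong early ones."""
    n = len(trace)
    out = np.zeros(n)
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        rms = np.sqrt(np.mean(trace[lo:hi] ** 2))
        out[i] = trace[i] / rms if rms > 0 else 0.0
    return out

# Synthetic decaying trace: AGC balances the late, weak part.
t = np.linspace(0.0, 1.0, 500)
trace = np.exp(-3.0 * t) * np.sin(2.0 * np.pi * 120.0 * t)
balanced = agc(trace)
```

After AGC the late samples, whose raw amplitude has decayed to a few percent of the first arrivals, are restored to roughly unit level, which is what makes the weak reflections visible before stacking.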
Abstract:
This paper deals with Joan Robinson's contributions to the issue of technical progress and her attempts to treat this subject in accordance with the Keynesian theory of employment and income distribution, mainly in the long run. This paper aims to review this aspect of her work and to establish a systematisation and formalisation of her approach. At the same time, the paper exposes the problems she faced - and did not always solve. Looking through her main contributions, the paper concludes that she used different criteria for the classification of innovations and that these criteria depended on the specific situations described by the models in which she used the classification.
Abstract:
The purpose of this paper is to develop a new alternative technique to characterize the evolution of pitting corrosion, aiming at the classification, sizing and determination of the morphological parameters that govern pit growth. In this case, the method is applied to ABNT 304 stainless steel, thermally treated and exposed to salt spray for different lengths of time. The different conditions of the 304 steel, rolled and thermally treated, exhibited pits of similar geometry, predominantly conical, near-conical and irregular. On rolled steel, pit width increases faster than pit height; on thermally treated steel, however, pits are found to grow mainly in height.
Abstract:
The Ag-NOR staining technique and image analysis were used to evaluate morphological parameters (area, perimeter and axis ratio) in nucleoli from normal thyroids and from thyroids bearing proliferating lesions (carcinomas, adenomas and hyperplasias). Regions with normal appearance located close to adenomatous and carcinomatous regions, in the thyroid of every patient, were also analyzed for comparison with the respective pathological regions and with normal thyroids. Statistical analysis of the data for nucleolar area and perimeter allowed the separation of adenomas and carcinomas from hyperplasias and normal tissue, but not of the two components within each of these two groups. However, looking at the numbers, a sequence of increasing mean nucleolar areas may be observed in the order normal, hyperplasia, adenoma and carcinoma, indicating a sequence of increasing rRNA requirements in these different kinds of cells. The axis ratio, which denotes the nucleolar shape (round or oblong), did not show significant differences among tissues, suggesting that shape is not important in the characterization of these pathologies. Differences in nucleolar area and perimeter between the normal and affected regions of each patient were statistically significant for adenomas and carcinomas. When these normal regions were compared with the normal thyroids, significant differences were not obtained in the three evaluated parameters. The observations and their importance for histopathological diagnosis are discussed.
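The three morphological parameters evaluated above (area, perimeter and axis ratio) can be sketched for a binary object mask as follows (a simplified pixel-based sketch with a synthetic ellipse as the test object; it is not the authors' image analysis software):

```python
import numpy as np

def morphometry(mask):
    """Area, perimeter and axis ratio of a binary object:
    area = pixel count, perimeter = object pixels with a
    4-connected background neighbour, axis ratio from the
    eigenvalues of the pixel-coordinate covariance matrix."""
    ys, xs = np.nonzero(mask)
    area = len(xs)
    padded = np.pad(mask, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = int(np.sum(mask & ~interior))
    cov = np.cov(np.vstack([xs, ys]))
    evals = np.linalg.eigvalsh(cov)          # ascending order
    axis_ratio = np.sqrt(evals[1] / evals[0])  # >= 1; 1 means round
    return area, perimeter, axis_ratio

# A filled ellipse with semi-axes 20 and 10: oblong, ratio near 2.
yy, xx = np.mgrid[-10:11, -20:21]
ellipse = (xx / 20.0) ** 2 + (yy / 10.0) ** 2 <= 1.0
a, p, r = morphometry(ellipse)
```

An axis ratio near 1 corresponds to the round nucleoli discussed above, larger values to oblong ones, which is why the ratio is insensitive to the size differences that separate the pathologies.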
Abstract:
This paper presents a technique for real-time crowd density estimation based on the textures of crowd images. In this technique, the current image of a sequence of input images is classified into a crowd density class. The classification is then corrected by a low-pass filter based on the crowd density classification of the last n images of the input sequence. The technique achieved 73.89% correct classification in a real-time application on a sequence of 9892 crowd images. Distributed processing was used to obtain real-time performance. © Springer-Verlag Berlin Heidelberg 2005.
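The low-pass correction step above, which corrects the current frame's class using the classifications of the last n images, can be sketched as a majority vote over a sliding window (the window size, class labels and voting rule below are assumptions; the paper does not specify them here):

```python
from collections import Counter, deque

class DensitySmoother:
    """Low-pass correction of per-frame crowd density classes:
    keep the last n classifications and return the majority
    class, damping one-frame misclassifications."""

    def __init__(self, n=5):
        self.history = deque(maxlen=n)  # sliding window of classes

    def update(self, frame_class):
        self.history.append(frame_class)
        return Counter(self.history).most_common(1)[0][0]

s = DensitySmoother(n=5)
stream = ["low", "low", "high", "low", "low", "low"]  # "high" is a glitch
smoothed = [s.update(c) for c in stream]              # glitch voted away
```

The single spurious "high" frame never reaches the output, at the cost of a few frames of latency when the true density class actually changes.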
Abstract:
The evolution of informatics now offers the possibility of developing new techniques and methodologies for studies in all areas of human knowledge. In addition, the capacity of present personal computers to handle large volumes of data makes the creation and application of new analysis tools easy. This paper applied a fuzzy partition matrix to analyze data obtained from the Landsat 5 TM sensor, in order to elaborate a supervised classification of land use in the Arroio das Pombas microbasin in Botucatu, SP, Brazil. Because weights are attributed at the moment the signatures are created, a single training area could contribute to more than one cover class. A change in the classification result was also observed when compared to the maximum likelihood classification, mainly in terms of greater uniformity and better delineation of class boundaries.
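The key idea above, a pixel belonging to more than one cover class with different weights, can be sketched with the standard fuzzy c-means membership formula (an illustrative stand-in; the paper's exact weighting scheme, and the spectral signatures below, are assumptions):

```python
import numpy as np

def fuzzy_memberships(pixel, signatures, m=2.0):
    """Fuzzy-partition membership of a pixel in each class:
    inverse-distance weighting with fuzzifier m, so every
    class gets a nonzero weight that sums to 1."""
    d = np.linalg.norm(signatures - pixel, axis=1)
    d = np.maximum(d, 1e-12)               # avoid division by zero
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()                 # memberships sum to 1

# Two hypothetical class signatures (mean band values per class).
sigs = np.array([[0.1, 0.2],
                 [0.8, 0.9]])
u = fuzzy_memberships(np.array([0.2, 0.3]), sigs)
```

Unlike a hard maximum likelihood assignment, the membership vector keeps a small weight for the distant class, which is what lets one training area feed more than one cover class.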
Abstract:
Objective: To evaluate the marginal microleakage in enamel and dentin/cementum walls in preparations with a high C-factor, using 3 resin composite insertion techniques. The null hypothesis was that there is no difference among the 3 resin composite insertion techniques. Method and Materials: Standardized Class 5 cavities were prepared in the lingual and buccal aspects of 30 caries-free, extracted third molars. The prepared teeth were randomly assigned to 3 groups: (1) oblique incremental placement technique, (2) horizontal incremental placement technique, and (3) bulk insertion (single increment). The preparations were restored with a 1-bottle adhesive (Single Bond, 3M ESPE) and microhybrid resin composite (Z100, 3M ESPE). Specimens were isolated with nail varnish except for a 2-mm-wide rim around the restoration and thermocycled (1,000 thermal cycles, 5°C/55°C; 30-second dwell time). The specimens were immersed in an aqueous solution of 50 wt% silver nitrate for 24 hours, followed by 8 hours in a photo-developing solution, and evaluated for microleakage using an ordinal scale of 0 to 4. The microleakage scores obtained from the occlusal and gingival walls were analyzed with the Wilcoxon and Kruskal-Wallis nonparametric tests. Results: The null hypothesis was accepted. The horizontal incremental placement technique, the oblique incremental technique, and bulk insertion resulted in statistically similar enamel and dentin microleakage scores. Conclusion: Neither the incremental techniques nor the bulk placement technique was capable of eliminating marginal microleakage in preparations with a high C-factor.
Abstract:
During the petroleum well drilling operation, many mechanical and hydraulic parameters are monitored by an instrumentation system installed in the rig, called a mud-logging system. Its sensors, distributed throughout the rig, monitor different operation parameters such as weight on the hook and drillstring rotation. These measurements are known as mud-logging records and allow online tracking of the entire drilling process for well-monitoring purposes. In most cases, however, these data are stored without taking advantage of their full potential, since making use of the mud-logging data requires analysis and interpretation, which is not an easy task because of the large volume of information involved. This paper presents a Support Vector Machine (SVM) used to automatically classify the drilling operation stages through the analysis of some mud-logging parameters. In order to validate the results of the SVM technique, they were compared to a classification elaborated by a Petroleum Engineering expert. © 2006 IEEE.
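The SVM classification of drilling stages can be sketched with a minimal linear SVM trained by sub-gradient descent on the hinge loss (the two stages, the feature values and all hyperparameters below are invented for illustration; they are not the paper's data or model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mud-logging samples, features: (weight on hook, rotation rpm).
# Two illustrative stages: -1 = tripping (high hook load, no rotation),
# +1 = drilling (lower hook load, steady rotation). Values are invented.
X0 = rng.normal([200.0, 0.0], [20.0, 5.0], size=(100, 2))
X1 = rng.normal([120.0, 90.0], [20.0, 10.0], size=(100, 2))
X = np.vstack([X0, X1])
y = np.hstack([-np.ones(100), np.ones(100)])

# Standardize features, then minimize the regularized hinge loss
# by stochastic sub-gradient descent (a minimal linear SVM).
X = (X - X.mean(0)) / X.std(0)
w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for epoch in range(200):
    for i in rng.permutation(len(y)):
        if y[i] * (X[i] @ w + b) < 1:      # margin violated
            w += lr * (y[i] * X[i] - lam * w)
            b += lr * y[i]
        else:                               # only shrink w
            w -= lr * lam * w

pred = np.sign(X @ w + b)
accuracy = np.mean(pred == y)
```

On such well-separated synthetic stages the learned hyperplane classifies essentially every sample correctly; the real mud-logging records would of course be noisier and multi-class.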
Abstract:
This paper presents a method to enhance microcalcifications and classify their borders by applying the wavelet transform. By decomposing an image and removing its low-frequency sub-band, the microcalcifications are enhanced. By analyzing the effects of perturbations on the high-frequency sub-band, it is possible to classify their borders as smooth, rugged or undefined. Results show a false-positive reduction of 69.27% using a region-growing algorithm. © 2008 IEEE.
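The enhancement step, decomposing the image and discarding its low-frequency sub-band, can be sketched with a one-level 2-D Haar transform (a minimal stand-in for the paper's wavelet; the smooth background and the bright spot below are synthetic):

```python
import numpy as np

def haar2(img):
    """One-level 2-D Haar transform (columns paired, then rows).
    Returns (LL, LH, HL, HH) sub-bands at half resolution."""
    a = (img[:, ::2] + img[:, 1::2]) / 2.0
    d = (img[:, ::2] - img[:, 1::2]) / 2.0
    ll, lh = (a[::2] + a[1::2]) / 2.0, (a[::2] - a[1::2]) / 2.0
    hl, hh = (d[::2] + d[1::2]) / 2.0, (d[::2] - d[1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Inverse of haar2 (exact reconstruction)."""
    h, w = ll.shape
    a = np.zeros((2 * h, w))
    d = np.zeros((2 * h, w))
    a[::2], a[1::2] = ll + lh, ll - lh
    d[::2], d[1::2] = hl + hh, hl - hh
    out = np.zeros((2 * h, 2 * w))
    out[:, ::2], out[:, 1::2] = a + d, a - d
    return out

# Smooth background plus a small bright "microcalcification".
img = np.full((16, 16), 100.0)
img[7:9, 7:9] += 50.0
ll, lh, hl, hh = haar2(img)
enhanced = ihaar2(np.zeros_like(ll), lh, hl, hh)  # drop the LL band
```

With the LL band zeroed, the flat background reconstructs to zero while the small bright detail survives in the high-frequency sub-bands, which is the enhancement effect the abstract describes.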
Abstract:
Malware has become a major threat in recent years due to the ease of spread through the Internet. Malware detection has become difficult with the use of compression, polymorphic methods and techniques to detect and disable security software. These and other obfuscation techniques pose a problem for detection and classification schemes that analyze malware behavior. In this paper we propose a distributed architecture to improve malware collection, using different honeypot technologies to increase the variety of malware collected. We also present a daemon tool developed to grab malware distributed through spam, and a pre-classification technique that uses antivirus technology to separate malware into generic classes. © 2009 SPIE.
Abstract:
Most of the tasks in genome annotation can be at least partially automated. Since this annotation is time-consuming, facilitating some parts of the process - thus freeing the specialist to carry out more valuable tasks - has been the motivation for many tools and annotation environments. In particular, annotation of protein function can benefit from knowledge about enzymatic processes. The use of sequence homology alone is not a good approach to deriving this knowledge when there are only a few homologues of the sequence to be annotated. The alternative is to use motifs. This paper uses a symbolic machine learning approach to derive rules for the classification of enzymes according to the Enzyme Commission (EC). Our results show that, for the top class, the average global classification error is 3.13%. Our technique also produces a set of rules relating structural to functional information, which is important for understanding the protein's tridimensional structure and determining its biological function. © 2009 Springer Berlin Heidelberg.
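The kind of rule set a symbolic learner produces can be sketched as an ordered list of motif-presence conditions (the motif names and EC classes below are invented for illustration; in the paper the rules are learned from data, not hand-written):

```python
# Hypothetical motif-based classification rules, in the style of the
# human-readable rule sets produced by symbolic machine learning.
RULES = [
    (lambda m: "MOTIF_A" in m and "MOTIF_B" in m, "EC 1 (oxidoreductase)"),
    (lambda m: "MOTIF_C" in m, "EC 3 (hydrolase)"),
]

def classify(motifs, default="unclassified"):
    """Apply the rules in order; the first matching rule wins."""
    for condition, ec_class in RULES:
        if condition(motifs):
            return ec_class
    return default

label = classify({"MOTIF_A", "MOTIF_B"})
```

The appeal of such rules over a black-box classifier is exactly what the abstract notes: each prediction is traceable to an explicit structural condition.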
Abstract:
In this paper, a new method based on partial differential equations is presented for denoising images with textures. The proposed model combines a nonlinear anisotropic diffusion filter with recent harmonic analysis techniques. A wave atom shrinkage, combined with gradient-based detection, is used to guide the diffusion process so as to smooth the image while maintaining its essential characteristics. Two forcing terms are used to maintain and improve the edges, boundaries and oscillatory features of an image with irregular details and texture. Experimental results show the performance of our model for texture-preserving denoising compared to recent methods in the literature. © 2009 IEEE.
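The nonlinear anisotropic diffusion filter at the core of the model can be sketched with the classic Perona-Malik scheme (base filter only; the wave atom shrinkage and the two forcing terms of the paper are not reproduced, and all parameters below are assumptions):

```python
import numpy as np

def anisotropic_diffusion(img, steps=20, kappa=30.0, dt=0.2):
    """Perona-Malik diffusion: smooth flat regions while the
    edge-stopping function g damps diffusion across strong
    gradients, so edges are preserved."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)  # edge-stopping function
    for _ in range(steps):
        # Finite differences toward the four neighbours.
        dn = np.roll(u, -1, 0) - u
        ds = np.roll(u, 1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Noisy step edge: diffusion removes the noise but keeps the edge.
rng = np.random.default_rng(1)
img = np.hstack([np.zeros((32, 16)), 100.0 * np.ones((32, 16))])
noisy = img + rng.normal(0.0, 5.0, img.shape)
den = anisotropic_diffusion(noisy)
```

Small fluctuations (gradient well below kappa) diffuse freely, while across the 100-level step the conductance is essentially zero, which is the edge-preserving behavior the full model builds on.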
Abstract:
Traditional pattern recognition techniques cannot handle the classification of large datasets with both efficiency and effectiveness. In this context, the Optimum-Path Forest (OPF) classifier was recently introduced with the aim of achieving high recognition rates at low computational cost. Although OPF is much faster than Support Vector Machines for training, it is slightly slower for classification. In this paper, we present the Efficient OPF (EOPF), an enhanced and faster version of the traditional OPF, and validate it for the automatic recognition of white matter and gray matter in magnetic resonance images of the human brain. © 2010 IEEE.
Abstract:
In this work, signal processing techniques are used to improve the quality of images based on multi-element synthetic aperture techniques. Using several apodization functions to obtain different side-lobe distributions, a polarity function and a threshold criterion are used to develop an image compounding technique. The spatial diversity is increased using an additional array, which generates complementary information about the defects, improving the results of the proposed algorithm and producing high-resolution, high-contrast images. The inspection of isotropic plate-like structures using linear arrays and Lamb waves is presented. Experimental results are shown for a 1-mm-thick isotropic aluminum plate with artificial defects, using linear arrays formed by 30 piezoelectric elements and the low-dispersion symmetric mode S0 at a frequency of 330 kHz. © 2011 American Institute of Physics.
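The effect of apodization on the side-lobe distribution can be sketched by comparing the far-field beam pattern of a 30-element linear array under uniform and Hann weighting (an idealized narrowband sketch; the half-wavelength element pitch and the Hann window are assumptions, not the experimental configuration):

```python
import numpy as np

def array_factor(weights, angles_deg, pitch_wavelengths=0.5):
    """Normalized far-field beam pattern of a linear array with
    per-element weights (apodization); tapered weights trade a
    wider main lobe for lower side lobes."""
    n = len(weights)
    theta = np.radians(angles_deg)
    k = 2.0 * np.pi  # wavenumber, distances in wavelengths
    phase = k * pitch_wavelengths * np.sin(theta)[:, None] * np.arange(n)
    af = np.abs(np.exp(1j * phase) @ weights)
    return af / af.max()

angles = np.linspace(-90.0, 90.0, 721)
uniform = array_factor(np.ones(30), angles)
hann = array_factor(np.hanning(30), angles)

# Highest lobe well outside the main lobe (|angle| > 10 degrees).
side = np.abs(angles) > 10.0
peak_uniform = uniform[side].max()
peak_hann = hann[side].max()
```

The Hann-apodized pattern suppresses the far side lobes by more than an order of magnitude relative to uniform weighting, which is the kind of side-lobe diversity the compounding technique exploits across apodization functions.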