934 results for Automatic Calibration
Abstract:
In this work we propose a novel automatic cast iron segmentation approach based on the Optimum-Path Forest (OPF) classifier. Microscopic images of nodular, gray, and malleable cast irons are segmented using OPF and compared against Support Vector Machines (SVM) with a Radial Basis Function kernel and SVM without kernel mapping. The results show accurate and fast segmentation, with OPF outperforming both SVMs. Ours is the first work to apply OPF to automatic cast iron segmentation. © 2010 Springer-Verlag.
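A minimal sketch of the SVM baselines named above, using scikit-learn. The per-pixel feature vectors and training labels are assumed to come from manually marked regions of the microscope images; the OPF classifier itself is distributed by its authors (e.g., as the LibOPF library) and is not reproduced here.

import numpy as np
from sklearn.svm import SVC, LinearSVC

def segment_pixels(train_feats, train_labels, image_feats):
    """Train both SVM variants and return per-pixel label predictions."""
    svm_rbf = SVC(kernel="rbf", gamma="scale")  # SVM with RBF kernel
    svm_lin = LinearSVC()                       # SVM without kernel mapping
    svm_rbf.fit(train_feats, train_labels)
    svm_lin.fit(train_feats, train_labels)
    return svm_rbf.predict(image_feats), svm_lin.predict(image_feats)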
Abstract:
Digital image processing has been applied in several areas, especially those requiring tools for feature extraction and pattern identification in the studied images. In an initial stage, segmentation is used to divide the image into parts representing an object of interest for a specific study. Several methods attempt to perform this task, but it is difficult to find one that adapts easily to different types of images, which are often complex or highly specific. To address this problem, this work presents an adaptable segmentation method that can be applied to different types of images and provides better segmentation. The proposed method is based on a model of automatic multilevel thresholding and combines group histogram quantization, analysis of the histogram slope percentage, and maximum entropy calculation to define the thresholds. The technique was applied to segment cell nuclei and potential tissue rejection in myocardial images from cardiac transplant biopsies. The results are significant in comparison with those provided by one of the best-known segmentation methods in the literature. © 2010 IEEE.
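A minimal sketch of the maximum-entropy ingredient of such a method, in the single-threshold (Kapur-style) form; the group histogram quantization and slope-percentage analysis steps described above are omitted.

import numpy as np

def max_entropy_threshold(image):
    """Pick the gray level that maximizes the summed entropy of the two classes."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1          # class-conditional distributions
        h = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0])) \
            - np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h > best_h:
            best_t, best_h = t, h
    return best_t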
Abstract:
pCT deals with relatively thick targets such as the human head or trunk. The fidelity of pCT as a tool for proton therapy planning therefore depends on the accuracy of the physical formulas used to describe proton interaction with thick absorbers. Although the current overall accuracy of the proton stopping power in the Bethe-Bloch domain is about 1%, analytical calculations and Monte Carlo simulations with codes such as TRIM/SRIM, MCNPX, and GEANT4 do not agree with each other. Attempting to validate the codes against experimental data for thick absorbers brings some difficulties: only a few data sets are available, and they were acquired at different initial proton energies and for different absorber materials. In this work we compare the results of our Monte Carlo simulations with existing experimental data in terms of the reduced calibration curve, i.e., the range-energy dependence normalized, on the range scale, by the full projected CSDA range for the given initial proton energy in the given material (taken from the NIST PSTAR database), and, on the final proton energy scale, by the given initial energy of the protons. This approach is almost energy and material independent. The results of our analysis are important for pCT development because the contradictions observed at arbitrarily low initial proton energies can now easily be scaled to typical pCT energies. © 2010 American Institute of Physics.
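A minimal sketch of the normalization described above, assuming the CSDA range for the initial energy has been looked up (e.g., in NIST PSTAR): ranges are divided by that full projected range and final energies by the initial energy, so curves for different materials and beam energies can be overlaid.

import numpy as np

def reduced_curve(depth, final_energy, e0, csda_range):
    """Return (depth / R_CSDA(E0), E_final / E0) pairs for plotting."""
    return np.asarray(depth) / csda_range, np.asarray(final_energy) / e0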
Abstract:
GEANT4 simulations are essential for the development of medical tomography with proton beams (pCT). For thin absorbers, the latest releases of GEANT4 generate very similar final spectra that agree well with the results of other popular Monte Carlo codes such as TRIM/SRIM and MCNPX. For thick absorbers, however, the disagreements become evident. In part, these disagreements are due to the known contradictions between the NIST PSTAR and SRIM reference data. It is therefore interesting to compare the GEANT4 results with each other, with experiment, and with the results of other codes in a reduced form that is free from this kind of doubt. In this work, such a comparison is made within the Reduced Calibration Curve concept elaborated for proton beam tomography. © 2010 IEEE.
Abstract:
Although the differential scanning calorimetry (DSC) technique is already widely used for the characterization of advanced polymer materials, a new methodology, known as high-heating-rate DSC, has been developed. In conventional DSC experiments, the heating rate varies from 10 to 20 °C·min-1, the sample mass from 10 to 15 mg, and the standard aluminum sample pan weighs approximately 27 mg. To contribute to a better understanding of DSC behavior at different heating rates, this work correlates the influence of high heating rates with the thermal events observed in DSC experiments. Samples of metallic standards (In, Pb, Sn, and Zn) with masses varying from 0.570 mg to 20.9 mg were analyzed at multiple heating rates from 4 to 324 °C·min-1. To perform these experiments properly, precise and careful temperature and enthalpy calibrations were carried out and are discussed in depth. This work thus presents a DSC methodology able to generate reliable results at whatever heating rate the researcher chooses for characterizing the advanced materials used, for example, in the aerospace industry. It also helps DSC users obtain better and more accurate results from instruments already installed, improving analysis sensitivity and resolution in a single run. Polypropylene melting and enthalpy thermal events are also studied using both the conventional and the high-heating-rate DSC methods.
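A minimal sketch of one common step in such a calibration, under the assumption that instrument thermal lag grows roughly linearly with heating rate: the measured melting onset of a metallic standard (indium melts at 156.6 °C) is fitted against heating rate and the fit is used as a rate-dependent temperature correction. The numbers below are illustrative, not the paper's data.

import numpy as np

def onset_correction(heating_rates, measured_onsets, true_onset):
    """Fit T_onset(beta) = a*beta + b; return the correction at each rate."""
    a, b = np.polyfit(heating_rates, measured_onsets, 1)
    predicted = a * np.asarray(heating_rates) + b
    return true_onset - predicted  # add this to measured temperatures

rates = [4, 40, 120, 324]              # °C/min (illustrative)
onsets = [156.7, 157.2, 158.1, 159.8]  # °C, measured onsets (illustrative)
print(onset_correction(rates, onsets, 156.6))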
Abstract:
This work proposes a methodology for the optimized allocation of switches for automatic load transfer in distribution systems, aimed at improving reliability indices through the restoration of systems with voltage classes of 23 to 35 kV and radial topology. The automatic switches must be allocated in the system so that load can be transferred remotely among the sources at the substations. The switch allocation problem is formulated as a nonlinear constrained mixed-integer programming model subject to a set of economic and physical constraints. A dedicated Tabu Search (TS) algorithm is proposed to solve this model. The proposed methodology is tested on a large real-life distribution system. © 2011 IEEE.
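A generic Tabu Search skeleton for binary switch placement, illustrating the metaheuristic named above; the actual objective (reliability indices under economic and physical constraints) is problem-specific and is represented here by a user-supplied `cost` callable, a hypothetical stand-in.

import random

def tabu_search(n_positions, cost, iters=1000, tenure=20):
    """Minimize cost over 0/1 switch-placement vectors via single-bit flips."""
    current = [random.randint(0, 1) for _ in range(n_positions)]
    best, best_cost = current[:], cost(current)
    tabu = {}  # position -> iteration until which flipping it is forbidden
    for it in range(iters):
        moves = [i for i in range(n_positions) if tabu.get(i, 0) <= it]
        if not moves:
            continue
        # best non-tabu single-bit flip (aspiration criteria omitted)
        i = min(moves, key=lambda j: cost(current[:j] + [1 - current[j]] + current[j + 1:]))
        current[i] = 1 - current[i]
        tabu[i] = it + tenure
        c = cost(current)
        if c < best_cost:
            best, best_cost = current[:], c
    return best, best_cost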
Abstract:
Applications of the Finite Element Method (FEM) to three-dimensional domains are already well documented in the Computational Electromagnetics literature. However, despite the power and reliability of this technique for solving partial differential equations, only a few open source codes are available that are dedicated to solid modeling and automatic constrained tetrahedralization, the most time-consuming steps in a typical three-dimensional FEM simulation. Moreover, these open source codes are usually developed separately by distinct software teams, sometimes under conflicting specifications. In this paper, we describe an experiment in open source code integration for solid modeling and automatic mesh generation. The integration strategy and techniques are discussed, and examples and performance results are given, especially for complicated and irregular volumes that are not simply connected. © 2011 IEEE.
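For illustration, a minimal solid-modeling plus constrained-tetrahedralization pipeline using the open-source Gmsh Python API; this is a present-day stand-in for the kind of workflow discussed, not the specific codes the paper integrates. The geometry (a box with a cylindrical hole) is chosen to be not simply connected.

import gmsh

gmsh.initialize()
gmsh.model.add("demo")
# Solid modeling: subtract a cylinder from a box
box = gmsh.model.occ.addBox(0, 0, 0, 1, 1, 1)
cyl = gmsh.model.occ.addCylinder(0.5, 0.5, -0.1, 0, 0, 1.2, 0.2)
gmsh.model.occ.cut([(3, box)], [(3, cyl)])
gmsh.model.occ.synchronize()
# Automatic tetrahedral mesh generation
gmsh.model.mesh.generate(3)
gmsh.write("demo.msh")
gmsh.finalize()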
Abstract:
This paper presents three methods for the automatic detection of dust devil tracks in images of Mars, based mainly on Mathematical Morphology; their performance is analyzed and compared. A dataset of 21 images of the Martian surface, representative of the diversity of these track features, was used for developing, testing, and evaluating the methods, comparing their outputs with manually produced ground truth images. Methods 1 and 3, based on the closing top-hat and the path-closing top-hat, respectively, showed similar mean accuracies of around 90%, but the processing time was much greater for Method 1 than for Method 3. Method 2, based on the radial closing, was the fastest but showed the worst mean accuracy. Processing time was therefore the tiebreaking factor, favoring Method 3. © 2011 Springer-Verlag.
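A minimal sketch of the closing top-hat idea behind Method 1: the black top-hat (closing minus original) highlights dark, narrow structures such as dust devil tracks. The file name, footprint radius, and threshold are assumptions to be tuned, not the paper's settings.

import numpy as np
from skimage import io, morphology

image = io.imread("mars_surface.png", as_gray=True)  # hypothetical file name
tophat = morphology.black_tophat(image, morphology.disk(15))
mask = tophat > tophat.mean() + 2 * tophat.std()     # simple global threshold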
Abstract:
This paper describes a program for the automatic generation of code for Intel's 8051 microcontroller. The code is generated from a place-transition Petri net specification, with the goal of minimizing programming time. The code generated by our program has been observed to match the net model exactly, and no changes to the generated code are needed for its compilation to the target architecture. © 2011 IFAC.
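A hypothetical sketch of the general idea, not the paper's generator or its output format: a place-transition net is walked and a C routine is emitted, one guarded marking update per transition, which could then be compiled for the 8051 target.

net = {
    "places": {"idle": 1, "busy": 0},               # initial marking
    "transitions": {"start": (["idle"], ["busy"]),  # (input places, output places)
                    "stop":  (["busy"], ["idle"])},
}

def emit_c(net):
    lines = ["void step(void) {"]
    for name, (ins, outs) in net["transitions"].items():
        guard = " && ".join(f"{p} > 0" for p in ins)
        body = "".join(f" {p}--;" for p in ins) + "".join(f" {p}++;" for p in outs)
        lines.append(f"    if ({guard}) {{{body} }}  /* transition {name} */")
    lines.append("}")
    return "\n".join(lines)

print(emit_c(net))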
Abstract:
The main goal of the present work is to verify the applicability of the Immersed Boundary Method, together with the Virtual Physical Model, for solving the flow through the automatic valves of hermetic compressors. The valve was simplified to a two-dimensional radial diffuser with a diameter ratio of D/d = 1.5 and simulated for one cycle of the opening and closing process, with an imposed reed velocity of 3.0 cm/s, a dimensionless gap between the disks in the range 0.07 < s/d < 0.10, and an inlet Reynolds number of 1500. The good results obtained show that the methodology has great potential as a design tool for this type of valve system. © The Authors, 2011.
Abstract:
Spermatogenesis is crucial to species reproduction, and monitoring it may shed light on important aspects of the process. The quantification of germ cells can thus provide useful tools to improve the reproduction cycle. In this paper, we present the first work to address this problem in fish with machine learning techniques. We show how to obtain high recognition accuracies in identifying fish germ cells with several state-of-the-art supervised pattern recognition techniques. © 2011 IEEE.
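A minimal sketch of the supervised recognition setup described above, assuming feature vectors have already been extracted from the cell images: several standard classifiers are trained and compared by held-out accuracy. The specific classifiers below are common choices, not necessarily the paper's.

from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

def compare_classifiers(features, labels):
    X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3)
    for clf in (SVC(kernel="rbf"), KNeighborsClassifier(3), GaussianNB()):
        clf.fit(X_tr, y_tr)
        print(type(clf).__name__, accuracy_score(y_te, clf.predict(X_te)))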
Abstract:
Duplex and superduplex stainless steels are classes of materials of high importance for engineering purposes, since they combine good mechanical properties with strong corrosion resistance. It is also known that the chemical composition of such steels is very important for maintaining certain desired properties. In recent years, some works have reported that γ2 precipitation improves the toughness of these steels, and its quantification may reveal important information about steel quality. We therefore propose the automatic segmentation of γ2 precipitation using two pattern recognition techniques: the Optimum-Path Forest (OPF) and a Bayesian classifier. To the best of our knowledge, this is the first time that machine learning techniques have been applied in this area. The experimental results showed that both techniques achieved similarly good recognition rates. © 2012 Taylor & Francis Group.
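A minimal sketch of the Bayesian-classifier half of the comparison above, using Gaussian naive Bayes over per-pixel features (an assumption about the feature setup); the OPF side is omitted here.

import numpy as np
from sklearn.naive_bayes import GaussianNB

def segment_precipitates(train_pixels, train_labels, image_pixels, shape):
    """Classify each pixel and reshape the labels into a precipitate map."""
    clf = GaussianNB().fit(train_pixels, train_labels)
    return clf.predict(image_pixels).reshape(shape)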
Abstract:
This paper presents a method for the indirect orientation of aerial images using ground control lines extracted from airborne laser system (ALS) data. This data integration strategy has shown good potential for the automation of photogrammetric tasks, including the indirect orientation of images. The most important characteristic of the proposed approach is that the exterior orientation parameters (EOP) of a single image or of multiple images can be computed automatically by a space resection procedure from data derived from different sensors. The method works as follows. First, straight lines are automatically extracted from the digital aerial image (s) and from the intensity image derived from an ALS dataset (S). Then, the correspondence between s and S is automatically determined. A line-based coplanarity model establishing the relationship between straight lines in object space and in image space is used to estimate the EOP with iterated extended Kalman filtering (IEKF). Implementation and testing of the method employed data from different sensors. Experiments were conducted to assess the proposed method, and the results showed that the accuracy of the estimated EOP is a function of the ALS positional accuracy.
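A minimal sketch of the straight-line extraction step only (the line-based coplanarity model and IEKF estimation are beyond a short example): edges are detected and line segments extracted with the probabilistic Hough transform. File names and parameters are assumptions.

from skimage import io, feature, transform

def extract_lines(path):
    img = io.imread(path, as_gray=True)
    edges = feature.canny(img, sigma=2.0)
    return transform.probabilistic_hough_line(edges, threshold=10,
                                              line_length=50, line_gap=3)

lines_s = extract_lines("aerial_image.png")   # hypothetical file names
lines_S = extract_lines("als_intensity.png")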
Abstract:
Latent fingerprints are routinely found at crime scenes due to the inadvertent contact of criminals' fingertips with various objects. As such, they have been used as crucial evidence for identifying and convicting criminals by law enforcement agencies. However, compared to plain and rolled prints, latent fingerprints usually have poor-quality ridge impressions over a small fingerprint area, with large overlap between the foreground (friction ridge pattern) and structured or random background noise. Latent fingerprint segmentation is accordingly a difficult problem. In this paper, we propose a latent fingerprint segmentation algorithm whose goal is to separate the fingerprint region (region of interest) from the background. Our algorithm uses both ridge orientation and frequency features. The orientation tensor is used to obtain the symmetric patterns of fingerprint ridge orientation, and a local Fourier analysis method is used to estimate the local ridge frequency of the latent fingerprint. Candidate fingerprint (foreground) regions are obtained for each feature type, and the intersection of the regions from the orientation and frequency features localizes the true latent fingerprint regions. To verify the viability of the proposed segmentation algorithm, we evaluated the segmentation results in two respects: comparison with the ground truth foreground, and matching performance based on the segmented region. © 2012 IEEE.
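A minimal sketch of the orientation half of such an algorithm, assuming the structure tensor as the orientation tensor: its coherence measures how consistently oriented the local ridge pattern is, and thresholding it yields a candidate foreground mask. The frequency mask from local Fourier analysis would then be intersected with it. File name, smoothing scale, and threshold are assumptions.

import numpy as np
from skimage import io
from skimage.feature import structure_tensor

img = io.imread("latent.png", as_gray=True)  # hypothetical file name
Axx, Axy, Ayy = structure_tensor(img, sigma=8)
# Coherence in [0, 1]: near 1 for strongly oriented ridge-like regions
coherence = np.sqrt((Axx - Ayy) ** 2 + 4 * Axy ** 2) / (Axx + Ayy + 1e-12)
orientation_mask = coherence > 0.5           # threshold is an assumption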
Abstract:
Image categorization by means of bags of visual words has received increasing attention from the image processing and vision communities in recent years. In these approaches, each image is represented by invariant points of interest that are mapped to a Hilbert space representing a visual dictionary, which aims to comprise the most discriminative features in a set of images. Notwithstanding, the main problem of such approaches is finding a compact and representative dictionary, and finding such a dictionary automatically, with no user intervention, is an even more difficult task. In this paper, we propose a method to find the dictionary automatically by employing the recently developed graph-based clustering algorithm Optimum-Path Forest, which makes no assumption about the visual dictionary's size and is more efficient and effective than the state-of-the-art techniques used for dictionary generation. © 2012 IEEE.
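A minimal sketch of the bag-of-visual-words pipeline discussed above, with k-means standing in for the OPF clustering step; note that, unlike OPF, k-means requires the dictionary size k to be fixed in advance, which is exactly the limitation the paper's approach removes.

import numpy as np
from sklearn.cluster import KMeans

def build_histograms(descriptor_sets, k=200):
    """descriptor_sets: list of (n_i, d) local descriptor arrays, one per image."""
    dictionary = KMeans(n_clusters=k).fit(np.vstack(descriptor_sets))
    # Each image becomes a k-bin histogram of visual word occurrences
    return [np.bincount(dictionary.predict(d), minlength=k) for d in descriptor_sets]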