838 results for Automated algorithms
Abstract:
This paper presents an automatic methodology for road network extraction from medium- and high-resolution aerial images. It is based on two steps. In the first step, the road seeds (i.e., road segments) are extracted using a set of four road objects and a set of connection rules among road objects. Each road object is a local representation of an approximately straight road fragment, and its construction is based on a combination of polygons describing all relevant image edges, according to rules embodying road knowledge. Each road seed is composed of a sequence of connected road objects, and each such sequence can be geometrically structured as a chain of contiguous quadrilaterals. In the second step, two strategies for road completion are applied in order to generate the complete road network. The first strategy is based on two basic perceptual grouping rules, proximity and collinearity, which allow the sequential reconstruction of gaps between every pair of disconnected road segments. This strategy does not allow the reconstruction of road crossings, but it does allow the extraction of road centerlines from the contiguous quadrilaterals representing connected road segments. The second strategy for road completion aims at reconstructing road crossings. First, the road centerlines are used to find reference points for road crossings, which approximate their positions. These points are then used to extract polygons representing the contours of road crossings. This paper presents the proposed methodology and experimental results. © Pleiades Publishing, Inc. 2006.
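The perceptual grouping step lends itself to a compact illustration. Below is a minimal sketch, not the paper's implementation, of how proximity and collinearity tests might decide whether two extracted road segments should be linked; the segment representation, the thresholds, and the `should_link` function are assumptions made for illustration only.

```python
import math

# Hedged sketch: proximity/collinearity grouping test between two road
# segments, each given as an ordered pair of (x, y) endpoints. The
# thresholds are illustrative, not the paper's values.
def should_link(seg_a, seg_b, max_gap=30.0, max_angle_deg=15.0):
    (ax1, ay1), (ax2, ay2) = seg_a
    (bx1, by1), (bx2, by2) = seg_b

    # Proximity rule: the gap between the facing endpoints must be small.
    gap = math.hypot(bx1 - ax2, by1 - ay2)
    if gap > max_gap:
        return False

    # Collinearity rule: the two segment directions must nearly agree.
    ang_a = math.atan2(ay2 - ay1, ax2 - ax1)
    ang_b = math.atan2(by2 - by1, bx2 - bx1)
    diff = abs(math.degrees(ang_a - ang_b)) % 360.0
    diff = min(diff, 360.0 - diff)
    return diff <= max_angle_deg

# Example: two nearly collinear fragments separated by a small gap.
print(should_link(((0, 0), (100, 2)), ((120, 3), (220, 6))))  # True
```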
Abstract:
Until mid-2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, such as a large cloud-dependent offset occurring at northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before the operational switch-on of SGP 3.0 and the public release of upgraded SCIAMACHY NO2 data, we investigated the accuracy of the algorithm transfer: (a) by checking the consistency of SGP 3.0 with the prototype algorithms; and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement over the previous processor, IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although their quantitative impact on SGP 3.0 vertical columns is not significant.
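For readers unfamiliar with DOAS-type retrievals, the quantities mentioned above are related by the standard conversion of a slant column to a vertical column through the air mass factor. The sketch below only illustrates that relation and a simple relative-difference validation metric; the numbers and function names are illustrative assumptions, not SCIAMACHY or SGP 3.0 values.

```python
# Hedged sketch of the standard DOAS relation used in such comparisons:
# the vertical column density (VCD) is the slant column density (SCD)
# divided by the air mass factor (AMF). All values below are illustrative.
def vertical_column(scd, amf):
    return scd / amf

def relative_difference(satellite_vcd, ground_vcd):
    # Simple validation metric: relative bias against the ground-based value.
    return (satellite_vcd - ground_vcd) / ground_vcd

sat_vcd = vertical_column(scd=8.0e15, amf=2.5)        # molecules / cm^2
print(relative_difference(sat_vcd, ground_vcd=3.0e15))  # about +0.07 here
```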
Abstract:
This paper discusses the main characteristics of, and presents a comparative analysis of, three synchronization algorithms based respectively on a Phase-Locked Loop, a Kalman Filter and a Discrete Fourier Transform. The single- and three-phase models of the first two methods and the single-phase model of the third are described. Details on how to modify the filtering properties or dynamic response of each algorithm are discussed in terms of their design parameters. In order to compare the different algorithms, these parameters are set for maximum filtering capability. The dynamic response during input amplitude and frequency deviations is then observed, as well as the behavior during the initialization procedure. Finally, the advantages and disadvantages of all considered algorithms are discussed. ©2007 IEEE.
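As a rough illustration of the first of these methods, the sketch below implements a basic single-phase PLL with a multiplier phase detector and a PI loop filter; the gains, sampling rate and the `pll_track` function are assumptions for illustration and do not reproduce the designs compared in the paper.

```python
import math

# Hedged sketch (assumed gains and sampling rate): a basic single-phase PLL
# with a multiplier phase detector, a PI loop filter and an integrator as the
# oscillator. The product detector leaves a double-frequency ripple on the
# estimate, so the result is averaged over the last samples once locked.
def pll_track(samples, fs, f_nominal=60.0, kp=100.0, ki=5000.0):
    theta = 0.0                          # estimated phase (rad)
    omega_nom = 2 * math.pi * f_nominal  # nominal angular frequency
    integral = 0.0
    dt = 1.0 / fs
    freq_estimates = []
    for v in samples:
        error = v * math.cos(theta)      # multiplier phase detector
        integral += ki * error * dt      # integral part of the PI filter
        omega_hat = omega_nom + kp * error + integral
        theta = (theta + omega_hat * dt) % (2 * math.pi)
        freq_estimates.append(omega_hat / (2 * math.pi))
    return freq_estimates

fs = 10000.0
grid = [math.sin(2 * math.pi * 60.5 * n / fs) for n in range(5000)]  # 0.5 s at 60.5 Hz
tail = pll_track(grid, fs)[-1000:]
print(sum(tail) / len(tail))             # roughly 60.5 Hz once the loop has locked
```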
Abstract:
This paper discusses two pitch detection algorithms (PDAs) for simple audio signals, based on the zero-crossing rate (ZCR) and the autocorrelation function (ACF). As is well known, pitch detection methods based on ZCR and ACF are widely used in signal processing. This work shows some features of and problems with these methods, as well as some improvements developed to increase their performance. © 2008 IEEE.
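Both estimators are simple enough to sketch. The code below shows textbook ZCR and ACF pitch estimates for a clean tone; the function names, lag range and test signal are illustrative assumptions, not the improvements developed in this work.

```python
import numpy as np

# Hedged sketch of the two classic estimators discussed above:
# zero-crossing rate (ZCR) and autocorrelation (ACF) pitch detection.
def pitch_zcr(x, fs):
    # Count sign changes; each period of a clean tone has two crossings.
    crossings = np.sum(np.abs(np.diff(np.signbit(x).astype(int))))
    duration = len(x) / fs
    return crossings / (2.0 * duration)

def pitch_acf(x, fs, f_min=50.0, f_max=1000.0):
    # Pick the autocorrelation peak inside the plausible lag range.
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    lag_min, lag_max = int(fs / f_max), int(fs / f_min)
    lag = lag_min + np.argmax(acf[lag_min:lag_max])
    return fs / lag

fs = 8000
t = np.arange(0, 0.1, 1.0 / fs)
tone = np.sin(2 * np.pi * 220.0 * t)
print(pitch_zcr(tone, fs), pitch_acf(tone, fs))   # both approximately 220 Hz
```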
Abstract:
This paper studies the use of different population structures in a Genetic Algorithm (GA) applied to lot sizing and scheduling problems. The population approaches are divided into two types: single-population and multi-population. The first type has a single, non-structured population. The multi-population type includes non-structured populations and structured populations organized in binary and ternary trees. Each population approach is tested on lot sizing and scheduling problems found in soft drink companies. These problems have two interdependent levels, with decisions concerning raw material storage and soft drink bottling. The challenge is to simultaneously determine the lot sizing and scheduling of raw materials in tanks and of products in lines. Computational results are reported, allowing the determination of the best population structure for the set of problem instances evaluated. Copyright 2008 ACM.
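To make the structural distinction concrete, the sketch below builds a tree-structured multi-population (ternary or binary, depending on the arity) and migrates each node's best individual to its parent; the migration policy, toy fitness and all parameters are assumptions for illustration, not the paper's GA.

```python
import random

# Hedged sketch (assumed parameters): a tree-structured multi-population in
# which each node holds its own subpopulation and sends its best individual
# up to its parent. arity=3 gives the ternary layout, arity=2 the binary one.
def make_tree_populations(n_nodes=13, arity=3, subpop_size=20, genome_len=8):
    rand_ind = lambda: [random.randint(0, 1) for _ in range(genome_len)]
    pops = [[rand_ind() for _ in range(subpop_size)] for _ in range(n_nodes)]
    parent = {i: (i - 1) // arity for i in range(1, n_nodes)}  # node 0 is the root
    return pops, parent

def migrate_best(pops, parent, fitness):
    # Replace the worst individual of each parent with its child's best one.
    for node, par in parent.items():
        best = max(pops[node], key=fitness)
        worst = min(range(len(pops[par])), key=lambda i: fitness(pops[par][i]))
        pops[par][worst] = list(best)

pops, parent = make_tree_populations()
migrate_best(pops, parent, fitness=sum)       # toy fitness: number of 1-bits
print(max(sum(ind) for ind in pops[0]))       # the root accumulates strong individuals
```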
Abstract:
The present study used cone-beam computed tomography (CBCT) to evaluate the apical canal transportation and centering ability of different automated systems after root canal preparation. The mesiobuccal canals of maxillary first molars (n=10 per group) were prepared with: GI - reciprocating system with K-Flexofile; GII - reciprocating system with NiTiFlex files; GIII - rotary system with K3 instruments; GIV - rotary system with RaCe instruments. CBCT scans were taken before and after biomechanical preparation up to a #40.02 diameter. Canal transportation was determined by measuring the smallest distance between the inner canal walls and the mesial and distal sides of the root. The centering ability corresponded to the difference between the measurements from the transportation evaluation, using a linear voxel-to-voxel method of analysis. The mean transportation was 0.06 ± 0.14 mm, with a tendency to deviate to the mesial side of the root (n=22), with no statistically significant difference among the groups (p=0.4153). The mean centering index was 0.15 ± 0.65, also without statistically significant difference among the groups (p=0.0881). It may be concluded that apical canal transportation and centering ability were not influenced by the type of mechanical movement and instruments used.
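As context, canal transportation and centering are often reported with the Gambill et al. formulation sketched below; this is a commonly used alternative formulation, not necessarily the exact linear voxel-to-voxel measurement applied in this study, and the distances are invented for illustration.

```python
# Hedged sketch using the widely cited Gambill et al. formulation. m1/d1 are
# the pre-operative distances from the mesial/distal canal wall to the root
# surface, m2/d2 the post-operative values (all in mm); values are invented.
def canal_transportation(m1, d1, m2, d2):
    return (m1 - m2) - (d1 - d2)

def centering_ratio(m1, d1, m2, d2):
    dm, dd = m1 - m2, d1 - d2
    if max(dm, dd) == 0:
        return 1.0          # no wall removal on either side: perfectly centered
    return min(dm, dd) / max(dm, dd)

# Illustrative values only.
print(canal_transportation(1.10, 1.30, 0.95, 1.25))  # ≈ 0.10 mm (more removal on the mesial wall)
print(centering_ratio(1.10, 1.30, 0.95, 1.25))       # ≈ 0.33
```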
Abstract:
Since Sharir and Pnueli, algorithms for context-sensitivity have been defined in terms of 'valid' paths in an interprocedural flow graph. The definition of valid paths requires atomic call and ret statements, and encapsulated procedures. Thus, the resulting algorithms are not directly applicable when behavior similar to call and ret instructions may be realized using non-atomic statements, or when procedures do not have rigid boundaries, as with programs in low-level languages like assembly or RTL. We present a framework for context-sensitive analysis that requires neither atomic call and ret instructions nor encapsulated procedures. The framework decouples the transfer-of-control semantics and the context-manipulation semantics of statements. A new definition of context-sensitivity, called stack contexts, is developed. A stack context, which is defined using trace semantics, is more general than Sharir and Pnueli's interprocedural-path-based calling context. An abstract-interpretation-based framework is developed to reason about stack contexts and to derive analogues of calling-context-based algorithms using stack contexts. The framework is suitable for deriving algorithms for analyzing binary programs, such as malware, that employ obfuscations with the deliberate intent of defeating automated analysis. The framework is used to create a context-sensitive version of Venable et al.'s algorithm for analyzing x86 binaries without requiring that a binary conform to a standard compilation model for maintaining procedures, calls, and returns. Experimental results show that a context-sensitive analysis using stack contexts performs just as well for programs where the use of Sharir and Pnueli's calling context produces correct approximations. However, if those programs are transformed to use call obfuscations, a context-sensitive analysis using stack contexts still provides the same, correct results without any additional overhead. © Springer Science+Business Media, LLC 2011.
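A drastically simplified sketch of the underlying idea: the analysis context is read from an abstract stack maintained by the trace semantics of push/pop, rather than from syntactic call/ret pairs, so a call obfuscated as push-plus-jmp still yields a meaningful context. The toy instruction format and addresses below are invented for illustration and bear no relation to the framework's actual abstract domain.

```python
# Hedged, highly simplified sketch: the "context" at each trace point is a
# snapshot of an abstract stack driven by push/pop semantics, not by
# matching call/ret instructions.
def stack_contexts(trace):
    stack = []
    contexts = []
    for addr, op, arg in trace:
        if op == "push":
            stack.append(arg)
        elif op == "pop" and stack:
            stack.pop()
        # jmp and other opcodes do not touch the stack in this toy model
        contexts.append((addr, tuple(stack)))   # context = current stack snapshot
    return contexts

# A "call 0x40" obfuscated as a push of the return address followed by a jmp.
trace = [
    (0x10, "push", 0x12),   # push the would-be return address
    (0x11, "jmp", 0x40),    # transfer control to the "procedure"
    (0x40, "push", 0xAA),   # callee-local work
    (0x41, "pop", None),
    (0x42, "pop", None),    # discards the return address: the context shrinks again
]
for addr, ctx in stack_contexts(trace):
    print(hex(addr), ctx)
```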
Abstract:
This paper presents the generation of optimal trajectories by genetic algorithms (GA) for a planar robotic manipulator. The implemented GA considers a multi-objective function that minimizes the end-effector positioning error together with the joint angular displacements, and it solves the inverse kinematics problem along the trajectory. Computer simulation results are presented to illustrate this implementation and show the efficiency of the methodology, which produces smooth trajectories at low computational cost. © 2011 Springer-Verlag Berlin Heidelberg.
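A minimal sketch of the kind of multi-objective fitness described here, for a two-link planar arm: end-effector position error plus a joint-displacement penalty. The link lengths, weights and the crude random search standing in for the GA are assumptions for illustration only.

```python
import math, random

# Hedged sketch (assumed parameters): multi-objective fitness for a
# two-link planar arm, combining position error and joint displacement.
L1_LEN, L2_LEN = 1.0, 1.0          # link lengths (assumed)

def forward_kinematics(q1, q2):
    x = L1_LEN * math.cos(q1) + L2_LEN * math.cos(q1 + q2)
    y = L1_LEN * math.sin(q1) + L2_LEN * math.sin(q1 + q2)
    return x, y

def fitness(q, q_prev, target, w_pos=1.0, w_disp=0.1):
    x, y = forward_kinematics(*q)
    pos_err = math.hypot(x - target[0], y - target[1])
    disp = abs(q[0] - q_prev[0]) + abs(q[1] - q_prev[1])
    return w_pos * pos_err + w_disp * disp        # lower is better

target, q_prev = (1.2, 0.8), (0.0, 0.0)
candidates = ((random.uniform(-math.pi, math.pi),
               random.uniform(-math.pi, math.pi)) for _ in range(5000))
best = min(candidates, key=lambda q: fitness(q, q_prev, target))
print(best, fitness(best, q_prev, target))
```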
Abstract:
The multi-relational data mining approach has emerged as an alternative for the analysis of structured data, such as relational databases. Unlike traditional algorithms, multi-relational proposals allow mining multiple tables directly, avoiding costly join operations. This paper presents a comparative study involving the traditional PatriciaMine algorithm and its proposed multi-relational counterpart, MR-Radix, in order to evaluate the performance of the two approaches for mining association rules from relational databases. The study presents two original contributions: the proposal of the multi-relational algorithm MR-Radix, which is efficient for use with relational databases both in execution time and in memory usage, and the empirical demonstration of the performance advantage of the multi-relational approach over multiple tables, which avoids costly join operations. © 2011 IEEE.
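For orientation, the sketch below shows only the generic support and confidence computation common to any association-rule miner on a single table; it is not the MR-Radix or PatriciaMine algorithm, and the transactions are invented for illustration.

```python
from itertools import combinations
from collections import Counter

# Hedged sketch of the basic support/confidence computation behind
# association-rule mining (itemsets of size 1 and 2 only, for brevity).
def frequent_itemsets(transactions, min_support):
    n = len(transactions)
    counts = Counter()
    for t in transactions:
        for size in (1, 2):
            counts.update(frozenset(c) for c in combinations(sorted(t), size))
    return {items: c / n for items, c in counts.items() if c / n >= min_support}

transactions = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c"}]
freq = frequent_itemsets(transactions, min_support=0.5)
print(freq)
# Confidence of the rule {a} -> {c}: support({a, c}) / support({a})
print(freq[frozenset({"a", "c"})] / freq[frozenset({"a"})])
```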
Abstract:
This paper presents vectorized methods of construction and descent of quadtrees that can be easily adapted to message passing parallel computing. A time complexity analysis for the present approach is also discussed. The proposed method of tree construction requires a hash table to index nodes of a linear quadtree in the breadth-first order. The hash is performed in two steps: an internal hash to index child nodes and an external hash to index nodes in the same level (depth). The quadtree descent is performed by considering each level as a vector segment of a linear quadtree, so that nodes of the same level can be processed concurrently. © 2012 Springer-Verlag.
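A loose sketch of breadth-first (level-order) indexing in a complete linear quadtree, with an "internal" child-within-parent step and an "external" within-level step; the formulas below assume a complete quadtree and are not the paper's hash functions.

```python
# Hedged sketch: level-order indexing of a complete linear quadtree.
def level_start(level):
    # Number of nodes in all shallower levels: (4^level - 1) / 3.
    return (4 ** level - 1) // 3

def child_index(parent_level, parent_offset, child):
    # "Internal" step: offset of the child (0..3) within its own level.
    child_offset = 4 * parent_offset + child
    # "External" step: global breadth-first index of that child.
    return level_start(parent_level + 1) + child_offset

print(level_start(2))        # 5  (1 root + 4 level-1 nodes precede level 2)
print(child_index(1, 2, 3))  # level-2 node: 5 + (4*2 + 3) = 16
```

Because nodes of a given level occupy a contiguous segment of the linear representation, a descent can treat each level as a vector segment and process its nodes concurrently, which is the property exploited here.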
Abstract:
The correct classification of sugar according to its physico-chemical characteristics directly influences the value of the product and its acceptance by the market. This study shows that using an electronic tongue system along with established supervised learning techniques leads to the correct classification of sugar samples according to their qualities. In this paper, we offer two new real, public and non-encoded sugar datasets whose attributes were automatically collected using an electronic tongue, with and without pH control. Moreover, we compare the performance achieved by several established machine learning methods. Our experiments were diligently designed to ensure statistically sound results, and they indicate that the k-nearest neighbors method outperforms the other evaluated classifiers and, hence, can be used as a good baseline for further comparison. © 2012 IEEE.
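A minimal sketch of the kind of k-NN baseline evaluation reported, using scikit-learn with cross-validation; the random feature matrix merely stands in for the electronic-tongue sugar datasets, and the class count and parameters are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Hedged sketch: k-NN evaluated by 10-fold cross-validation on sensor-signal
# features. The random data below only stands in for the public datasets.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))        # 120 samples, 10 sensor features (assumed)
y = rng.integers(0, 3, size=120)      # 3 hypothetical sugar quality classes

knn = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(knn, X, y, cv=10)
print(scores.mean(), scores.std())    # mean accuracy and its spread
```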
Abstract:
This paper presents a novel approach to the computerized assessment of a mammographic phantom device. The approach shown here is fully automated and is based on the automatic selection of the region of interest and on the use of the discrete wavelet transform (DWT) and morphological operators to assess the quality of American College of Radiology (ACR) mammographic phantom images. The algorithms developed here successfully scored 30 images obtained with different combinations of tube voltage and exposure, and could detect the differences in the radiographs due to the different levels of exposure to radiation. © 2013 Springer-Verlag.
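A hedged sketch of the two ingredients named above, a 2-D DWT and a morphological opening, applied to a synthetic speck image; the wavelet, threshold, structuring element and test image are assumptions for illustration, not the scoring algorithm itself.

```python
import numpy as np
import pywt
from scipy import ndimage

# Hedged sketch: the DWT approximation band acts as a denoised, reduced image,
# and a morphological opening discards isolated noise pixels before counting
# candidate phantom objects. The synthetic "speck" stands in for an ACR ROI.
rng = np.random.default_rng(1)
image = 0.05 * rng.normal(size=(64, 64))
image[30:34, 30:34] += 1.0                    # one small test object

cA, (cH, cV, cD) = pywt.dwt2(image, "haar")   # single-level 2-D Haar DWT
mask = cA > 1.0                               # illustrative threshold
mask = ndimage.binary_opening(mask, structure=np.ones((2, 2), dtype=bool))
n_objects = ndimage.label(mask)[1]
print("candidate objects detected:", n_objects)   # expected: 1
```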
Abstract:
Obtaining a semi-automatic quantification of pathologies found in the lung through high-resolution computed tomography (HRCT) images is of great importance to aid medical diagnosis. Paracoccidioidomycosis (PCM) is a systemic disease that affects the lung and, even after effective treatment, leaves sequelae such as pulmonary fibrosis and emphysema. It is very important to the area of tropical diseases that lung injury be quantified more accurately. In this study, we propose the development of algorithms in the Matlab® computational environment able to objectively quantify lung diseases such as fibrosis and emphysema. The program consists of selecting the region of interest (ROI) and, through the use of density masks and filters, obtaining the quantification of the lesion area in relation to the healthy area of the lung. The proposed method was tested on 15 HRCT exams of patients with confirmed PCM. To prove the validity and effectiveness of the method, we used a virtual phantom, also developed in this research. © 2013 Springer-Verlag.
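A minimal sketch of a density-mask quantification on a synthetic HRCT slice given in Hounsfield units; the attenuation band, ROI and data are illustrative assumptions, not the study's Matlab® implementation.

```python
import numpy as np

# Hedged sketch: the fraction of lung-ROI pixels whose attenuation falls
# inside a given Hounsfield-unit band is reported as the lesion percentage.
def density_mask_fraction(hu_slice, roi_mask, lower, upper):
    lesion = (hu_slice >= lower) & (hu_slice <= upper) & roi_mask
    return lesion.sum() / roi_mask.sum()

rng = np.random.default_rng(0)
hu = rng.normal(-800, 60, size=(128, 128))          # mostly aerated lung tissue (synthetic)
hu[40:60, 40:60] = rng.normal(-990, 10, (20, 20))   # a low-attenuation, emphysema-like patch
roi = np.ones_like(hu, dtype=bool)                  # whole slice taken as the ROI here

print("emphysema-like fraction:", density_mask_fraction(hu, roi, -1024, -950))
```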