55 results for rule-based algorithms
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Petroleum well drilling is an expensive and risky operation. In this context, well design is a fundamental means of reducing the costs and risks involved. The experience acquired by engineers is notably an important factor in elaborating good drilling designs, so the loss of this knowledge may entail additional problems and costs. This work is therefore an initiative to model a case-based architecture for petroleum well design. Tests with a prototype showed that a system built with this architecture can assist in well design and enable the preservation of corporate knowledge. (C) 2003 Elsevier B.V. All rights reserved.
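A case-based architecture like the one described rests on retrieving the stored design most similar to a new problem. The sketch below shows a minimal weighted-similarity retrieval step; the feature names, weights, and case base are hypothetical illustrations, not the article's actual attributes.

```python
# Minimal sketch of case-based retrieval for well design (hypothetical features).
# A new design problem is matched against stored cases by a weighted similarity;
# the closest past design would then be reused/adapted as a starting point.

def similarity(case, query, weights):
    """Weighted similarity in [0, 1]; features are assumed pre-scaled to [0, 1]."""
    total = sum(weights.values())
    score = sum(w * (1.0 - abs(case[f] - query[f])) for f, w in weights.items())
    return score / total

def retrieve(case_base, query, weights):
    """Return the stored case most similar to the query."""
    return max(case_base, key=lambda c: similarity(c, query, weights))

# Hypothetical case base: depth and pore pressure normalized to [0, 1].
case_base = [
    {"id": "well-A", "depth": 0.2, "pore_pressure": 0.3},
    {"id": "well-B", "depth": 0.8, "pore_pressure": 0.7},
]
weights = {"depth": 2.0, "pore_pressure": 1.0}
best = retrieve(case_base, {"depth": 0.75, "pore_pressure": 0.6}, weights)
```

In a full system the retrieved case would be adapted to the new well's conditions; retrieval is only the first step of the case-based reasoning cycle.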
Abstract:
The free carrier concentration can be evaluated, based on Drude's theory, from optical transmittance in the 800-2000 nm range (near infrared) for Sb-doped SnO2 thin films. In this article, we estimate the free carrier concentration for these films, which are deposited via sol-gel dip-coating. At approximately 900 nm, the transmittance curves of doped and undoped samples separate. The plasma resonance approach leads to a free carrier concentration of about 5 x 10^20 cm^-3. Increasing the Sb concentration increases the film conductivity; however, the measured resistivity remains very high. The only way to reconcile such a high free carrier concentration with a rather low conductivity is a very low mobility, which becomes plausible when the crystallite dimensions are taken into account. Estimating the grain size from line broadening in the X-ray diffraction pattern yields grains with an average size of 5 nm. The low conductivity is thus due to very intense scattering at the grain boundaries created by the presence of a large number of nanoscopic crystallites. This result is in accordance with X-ray photoemission spectroscopy data, which point to Sb incorporation proportional to the free electron concentration evaluated according to Drude's model. (c) 2006 Elsevier Ltd. All rights reserved.
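The Drude estimate above can be sketched numerically: the plasma frequency fixes the carrier concentration through N = eps0 * eps_inf * m* * omega_p^2 / e^2. The plasma wavelength (1.6 um), high-frequency dielectric constant (4.0), and effective mass (0.3 m_e) used below are illustrative literature-typical assumptions for doped SnO2, not values taken from this article.

```python
# Back-of-envelope Drude estimate of free-carrier concentration from the
# plasma resonance wavelength. Input parameters are illustrative assumptions.
import math

E0 = 8.854e-12        # vacuum permittivity, F/m
E_CHARGE = 1.602e-19  # elementary charge, C
M_E = 9.109e-31       # electron rest mass, kg
C = 2.998e8           # speed of light, m/s

def carrier_concentration(lambda_p, eps_inf, m_eff):
    """N = eps0 * eps_inf * m* * omega_p**2 / e**2, omega_p = 2*pi*c/lambda_p."""
    omega_p = 2.0 * math.pi * C / lambda_p
    return E0 * eps_inf * m_eff * omega_p**2 / E_CHARGE**2

n_m3 = carrier_concentration(1.6e-6, 4.0, 0.3 * M_E)
n_cm3 = n_m3 * 1e-6  # convert m^-3 to cm^-3
```

With these assumed inputs the estimate lands in the 10^20 cm^-3 range, the same order of magnitude the abstract reports.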
Abstract:
To enhance the global search ability of population-based incremental learning (PBIL) methods, it is proposed that multiple probability vectors be included in available PBIL algorithms. The strategy for updating those probability vectors and the negative learning and mutation operators are redefined correspondingly. Moreover, to strike the best tradeoff between exploration and exploitation, an adaptive updating strategy for the learning rate is designed. Numerical examples are reported to demonstrate the pros and cons of the newly implemented algorithm.
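The core multi-vector idea can be sketched as follows: several probability vectors are sampled and updated independently, each pulled toward its own best sample. This is a simplified stand-in (OneMax fitness, fixed learning rate, no negative learning or mutation) rather than the paper's full algorithm; all parameter names and values are ours.

```python
# Sketch of PBIL with multiple probability vectors on a binary OneMax problem.
# Each vector samples its own sub-population and learns from its best sample.
import random

def pbil_multi(n_bits=20, n_vectors=3, pop_per_vec=10, iters=50, lr=0.1, seed=1):
    rng = random.Random(seed)
    vectors = [[0.5] * n_bits for _ in range(n_vectors)]
    best, best_fit = None, -1
    for _ in range(iters):
        for pv in vectors:
            samples = [[1 if rng.random() < p else 0 for p in pv]
                       for _ in range(pop_per_vec)]
            winner = max(samples, key=sum)            # OneMax fitness = bit sum
            if sum(winner) > best_fit:
                best, best_fit = winner, sum(winner)
            for i, bit in enumerate(winner):          # move pv toward the winner
                pv[i] = (1 - lr) * pv[i] + lr * bit
    return best, best_fit

best, best_fit = pbil_multi()
```

Keeping several vectors preserves diversity: each vector can converge toward a different region of the search space, which is the intended gain in global search ability.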
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Accurate long-term monitoring of total ozone is one of the most important requirements for identifying possible natural or anthropogenic changes in the composition of the stratosphere. For this purpose, the NDACC (Network for the Detection of Atmospheric Composition Change) UV-visible Working Group has made recommendations for improving and homogenizing the retrieval of total ozone columns from twilight zenith-sky visible spectrometers. These instruments, deployed worldwide at about 35 stations, allow total ozone to be measured twice daily with limited sensitivity to stratospheric temperature and cloud cover. The NDACC recommendations address both the DOAS spectral parameters and the calculation of the air mass factors (AMF) needed to convert O3 slant column densities into vertical column amounts. The most important improvement is the use of O3 AMF look-up tables calculated with the TOMS V8 (TV8) O3 profile climatology, which accounts for the dependence of the O3 AMF on the seasonal and latitudinal variations of the O3 vertical distribution. To investigate their impact on the retrieved ozone columns, the recommendations have been applied to measurements from the NDACC/SAOZ (Systeme d'Analyse par Observation Zenithale) network. The revised SAOZ ozone data from eight stations deployed at all latitudes have been compared to TOMS, GOME-GDP4, SCIAMACHY-TOSOMI, SCIAMACHY-OL3, OMI-TOMS, and OMI-DOAS satellite overpass observations, as well as to collocated Dobson and Brewer instruments at Observatoire de Haute Provence (44 degrees N, 5.5 degrees E) and Sodankyla (67 degrees N, 27 degrees E), respectively. A significantly better agreement is obtained between SAOZ and correlative reference ground-based measurements after applying the new O3 AMFs. However, systematic seasonal differences between SAOZ and satellite instruments remain.
These are shown to originate mainly from (i) a possible problem in the satellite retrieval algorithms in dealing with the temperature dependence of the ozone cross-sections in the UV and with the solar zenith angle (SZA) dependence; (ii) zonal modulations and seasonal variations of tropospheric ozone columns not accounted for in the TV8 profile climatology; and (iii) uncertainty in the stratospheric ozone profiles at high latitude in winter in the TV8 climatology. For the measurements mostly sensitive to stratospheric temperature, like TOMS, OMI-TOMS, Dobson, and Brewer, or to SZA, like SCIAMACHY-TOSOMI, applying temperature and SZA corrections removes almost all of the seasonal difference with SAOZ, significantly improving the consistency between all ground-based and satellite total ozone observations.
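The slant-to-vertical conversion at the heart of the retrieval is VCD = SCD / AMF, with the AMF read from a look-up table indexed, among other things, by solar zenith angle. The sketch below shows that conversion with a small one-dimensional table; the grid and AMF values are purely illustrative, not the TOMS-V8-based NDACC tables.

```python
# Minimal sketch of converting an O3 slant column density (SCD) into a
# vertical column density (VCD) via an AMF look-up table in solar zenith angle.
# Table values are illustrative only.
import bisect

SZA_GRID = [80.0, 85.0, 87.0, 89.0, 91.0]   # solar zenith angle, degrees
AMF_GRID = [5.6, 9.8, 13.1, 17.4, 20.2]     # assumed twilight O3 AMFs

def amf(sza):
    """Linear interpolation of the AMF look-up table (clamped extrapolation)."""
    i = bisect.bisect_left(SZA_GRID, sza)
    i = min(max(i, 1), len(SZA_GRID) - 1)
    x0, x1 = SZA_GRID[i - 1], SZA_GRID[i]
    y0, y1 = AMF_GRID[i - 1], AMF_GRID[i]
    return y0 + (y1 - y0) * (sza - x0) / (x1 - x0)

def vertical_column(scd, sza):
    """VCD = SCD / AMF."""
    return scd / amf(sza)

vcd = vertical_column(scd=3.0e19, sza=88.0)  # molecules/cm^2, illustrative SCD
```

The real tables also depend on season, latitude, and the O3 profile shape, which is exactly the dependence the TV8 climatology introduces.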
Abstract:
Practical methods are presented for the land grading design of a plane surface for rectangular and irregularly shaped fields, based on a least squares analysis. The least squares procedure leads to a system of three linear equations in three unknowns for determining the best-fit plane. The equations can be solved by determinants (Cramer's rule) using a procedure simple enough for many programmable calculators. The detailed computational process for determining the equation of the plane and a simple method for finding the centroid location of an irregular field are also given. An illustrative example and design instructions are included to demonstrate the application of the design procedure.
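The procedure above can be sketched directly: the normal equations of the fit z = a + b*x + c*y form a 3x3 linear system solved by Cramer's rule, exactly as a programmable calculator would. The survey points below are synthetic illustrations.

```python
# Least-squares best-fit plane z = a + b*x + c*y for land grading, solved via
# Cramer's rule on the 3x3 normal equations of the fit.
def fit_plane(points):
    """points: list of (x, y, z) survey elevations; returns (a, b, c)."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0]**2 for p in points); syy = sum(p[1]**2 for p in points)
    sxy = sum(p[0]*p[1] for p in points)
    sxz = sum(p[0]*p[2] for p in points); syz = sum(p[1]*p[2] for p in points)

    def det3(m):
        return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

    A = [[n, sx, sy], [sx, sxx, sxy], [sy, sxy, syy]]   # normal equations
    rhs = [sz, sxz, syz]
    D = det3(A)
    coeffs = []
    for col in range(3):                                # Cramer's rule
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][col] = rhs[r]
        coeffs.append(det3(Ai) / D)
    return tuple(coeffs)  # (a, b, c)

# Stations laid out on an exact plane recover its intercept and slopes.
pts = [(x, y, 1 + 0.02*x - 0.01*y) for x in (0, 30, 60) for y in (0, 30, 60)]
a, b, c = fit_plane(pts)
```

Here b and c are the design grades in the x and y directions, which is what the land grading plan ultimately needs.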
Abstract:
This paper describes two solutions for the systematic measurement of surface elevation that can be used for both profile and surface reconstructions in quantitative fractography case studies. The first is developed under the Khoros graphical interface environment. It consists of an adaptation of the almost classical area matching algorithm, which is based on cross-correlation operations, to the well-known method of parallax measurements from stereo pairs. A normalization function was created to avoid false cross-correlation peaks, leading to the true best-matching window at each region analyzed on both stereo projections. Some limitations on the use of scanning electron microscopy and on the types of surface patterns are also discussed. The second algorithm is based on a spatial correlation function. This solution is implemented with NIH Image macro programming, combining a good representation of low-contrast regions with many improvements in overall user interface and performance. Its advantages and limitations are also presented.
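The normalization that suppresses false peaks can be illustrated with the standard zero-mean normalized cross-correlation: subtracting the window means and dividing by the standard deviations makes the score insensitive to brightness and contrast differences between the stereo projections. A 1-D sketch is shown here for brevity; the specific normalization in the paper may differ.

```python
# Sketch of area matching by normalized cross-correlation: slide a template
# window along a search strip and take the shift with the highest score
# (that shift is the parallax used for elevation measurement).
def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-length windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma)**2 for x in a) ** 0.5
    db = sum((y - mb)**2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def best_match(window, strip):
    """Shift of `window` along `strip` with the highest NCC."""
    w = len(window)
    scores = [ncc(window, strip[s:s + w]) for s in range(len(strip) - w + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

strip = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0]   # illustrative intensity profile
window = [1, 5, 9, 5, 1]
shift = best_match(window, strip)
```

On real SEM images the same idea runs in 2-D, and the recovered shift feeds the parallax-to-elevation formula for the stereo pair.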
Abstract:
This article presents a quantitative and objective approach to cat ganglion cell characterization and classification. Several biologically relevant features, such as diameter, eccentricity, fractal dimension, influence histogram, influence area, convex hull area, and convex hull diameter, are derived from geometrical transforms and then processed by three different clustering methods (Ward's hierarchical scheme, K-means, and a genetic algorithm), whose results are combined by a voting strategy. These experiments indicate the superiority of some features and also suggest some possible biological implications.
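One simple way to combine several clusterings by voting is a co-association scheme: two cells end up in the same consensus group when a majority of the individual methods placed them together. The sketch below is a generic stand-in for the article's voting strategy, whose exact rule is not specified here.

```python
# Sketch of majority-vote consensus over several clusterings (co-association):
# union two items whenever most of the labelings agree they share a cluster.
def consensus(labelings, threshold=0.5):
    """labelings: list of label lists over the same n items; returns groups."""
    n = len(labelings[0])
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            votes = sum(1 for lab in labelings if lab[i] == lab[j])
            if votes / len(labelings) > threshold:
                parent[find(i)] = find(j)  # majority says: same cluster
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Three methods agree on {0,1} and {2,3}; one dissents about item 1.
labelings = [[0, 0, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1]]
groups = consensus(labelings)
```

The vote makes the final partition robust to any single method's idiosyncrasies, which is the usual motivation for combining Ward's scheme, K-means, and a genetic algorithm.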
Abstract:
In this article we describe a feature extraction algorithm for pattern classification based on Bayesian decision boundaries and pruning techniques. The proposed method is capable of optimizing MLP neural classifiers by retaining only those neurons in the hidden layer that really contribute to correct classification. We also propose a method that defines a plausible number of neurons in the hidden layer based on stem-and-leaf plots of the training samples. Experimental investigation reveals the efficiency of the proposed method. © 2002 IEEE.
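Pruning a hidden layer amounts to scoring each neuron's contribution and dropping those below a threshold. The sketch below uses a simple outgoing-weight-magnitude criterion as a generic stand-in; the article's actual criterion is based on Bayesian decision boundaries, which is not reproduced here.

```python
# Sketch of hidden-neuron pruning: a neuron whose outgoing weights are all
# near zero barely affects the output and can be removed. This magnitude
# criterion is a simplified stand-in for the paper's decision-boundary test.
def prune_hidden(w_hidden_out, threshold=1e-2):
    """w_hidden_out[i] = outgoing weights of hidden neuron i; keep the useful ones."""
    kept = [i for i, ws in enumerate(w_hidden_out)
            if max(abs(w) for w in ws) > threshold]
    return kept

w = [[0.8, -0.3],    # neuron 0: clearly contributes
     [1e-4, 2e-3],   # neuron 1: effectively dead
     [0.0, 0.5]]     # neuron 2: contributes through one output
kept = prune_hidden(w)
```

After pruning, the classifier is retrained with only the retained neurons, so the final hidden-layer size emerges from the data rather than being fixed in advance.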
Abstract:
Three-phase three-wire power flow algorithms, like any tool for power systems analysis, require reliable impedances and models in order to obtain accurate results. Kron's reduction procedure, which embeds the neutral wire influence into the phase wires, has shown good results when used with three-phase three-wire power flow algorithms based on the current summation method. However, Kron's reduction can harm the reliability of algorithms whose iterative processes need loss calculation (the power summation method). In this work, three three-phase three-wire power flow algorithms based on the power summation method are compared with a three-phase four-wire approach based on the backward-forward technique and current summation. Two four-wire unbalanced medium-voltage distribution networks are analyzed, and the results are presented and discussed. © 2004 IEEE.
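Kron's reduction eliminates the neutral row and column of the 4x4 line impedance matrix via Z'_ij = Z_ij - Z_in * Z_nj / Z_nn, folding the neutral's effect into the phase wires. The sketch below uses real-valued, symmetric illustrative impedances for brevity; actual line matrices are complex.

```python
# Kron's reduction of a 4x4 phase+neutral impedance matrix to a 3x3 phase
# matrix: Z'_ij = Z_ij - Z_in * Z_nj / Z_nn, with n the neutral index.
def kron_reduce(Z, n=3):
    """Eliminate row/column n (the neutral) of the square matrix Z."""
    size = len(Z)
    keep = [i for i in range(size) if i != n]
    return [[Z[i][j] - Z[i][n] * Z[n][j] / Z[n][n] for j in keep] for i in keep]

# Illustrative 4x4 impedance matrix (ohms/km), phases a,b,c + neutral.
Z = [[0.4, 0.1, 0.1, 0.1],
     [0.1, 0.4, 0.1, 0.1],
     [0.1, 0.1, 0.4, 0.1],
     [0.1, 0.1, 0.1, 0.4]]
Zp = kron_reduce(Z)
```

The reduced 3x3 matrix reproduces the phase voltages correctly, but, as the abstract notes, the losses computed from it no longer separate the neutral's contribution, which is what degrades power-summation-based iterations.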
Abstract:
An analysis of the performance of three important methods for allocating losses to generators and loads is presented. The methods discussed are based on the pro-rata technique, on the incremental technique, and on circuit matrices. The algorithms are tested under different generation conditions using a well-known electric power system, the IEEE 14-bus network. The results presented and discussed verify the location and magnitude of generators and loads, the possibility of having agents well or poorly located in each network configuration, and the discriminatory behavior under variations of the power flow in the transmission lines. © 2004 IEEE.
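The simplest of the three methods, pro-rata allocation, splits total system losses among agents in proportion to their power: it ignores network location entirely, which is precisely the behavior the comparison examines. A minimal sketch, with illustrative numbers:

```python
# Pro-rata loss allocation sketch: total losses are divided among loads in
# proportion to their demand (generators can be treated symmetrically by
# injection). Bus names and values are illustrative.
def pro_rata(total_losses, demands):
    """Allocate total_losses proportionally to each bus demand."""
    total = sum(demands.values())
    return {bus: total_losses * d / total for bus, d in demands.items()}

alloc = pro_rata(6.0, {"bus2": 20.0, "bus3": 50.0, "bus4": 30.0})  # MW
```

Because the split depends only on magnitudes, two loads of equal demand pay the same regardless of how far they sit from the generation, whereas incremental and circuit-matrix methods do discriminate by location.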
Abstract:
To enhance the global search ability of population-based incremental learning (PBIL) methods, it is proposed that multiple probability vectors be included in available PBIL algorithms. As a result, the strategy for updating those probability vectors and the negative learning and mutation operators are redefined as reported. Numerical examples are reported to demonstrate the pros and cons of the newly implemented algorithm. © 2006 IEEE.
Abstract:
When the food supply finishes, or when the larvae of blowflies complete their development and migrate prior to the total removal of the larval substrate, they disperse to find adequate places for pupation, a process known as post-feeding larval dispersal. Based on experimental data on the initial and final configurations of the dispersion, the reproduction of this spatio-temporal behavior is achieved here by means of an evolutionary search for cellular automata with a distinct transition rule associated with each cell, also known as nonuniform cellular automata, with two states per cell in the lattice. Two-dimensional regular lattices and multivalued states are considered, and a practical question is the necessity of discovering a proper set of transition rules. Given that the number of rules is related to the number of cells in the lattice, the search space is very large, and an evolution strategy is therefore used to optimize the parameters of the transition rules, with two transition rules per cell. As the parameters to be optimized admit a physical interpretation, the resulting computational model can be analyzed to raise hypothetical explanations of the observed spatio-temporal behavior. © 2006 IEEE.
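The defining feature of a nonuniform cellular automaton is that each cell applies its own transition rule. The sketch below shows one synchronous update step on a toroidal 2-D lattice, with each cell's rule reduced to a per-cell threshold on the number of active neighbours; this is a simplification of the parameterized per-cell rules the evolution strategy would optimize.

```python
# One update step of a nonuniform cellular automaton on a 2-D toroidal
# lattice: each cell fires when its count of active Moore neighbours reaches
# that cell's own threshold (a stand-in for evolved per-cell rules).
def step(grid, thresholds):
    """grid: 2-D list of 0/1 states; thresholds: per-cell activation thresholds."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            active = sum(grid[(r + dr) % rows][(c + dc) % cols]
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                         if (dr, dc) != (0, 0))
            new[r][c] = 1 if active >= thresholds[r][c] else 0
    return new

grid = [[0, 1, 0], [1, 1, 1], [0, 1, 0]]        # larvae-occupied cells
thresholds = [[3] * 3 for _ in range(3)]        # uniform here; per-cell in general
nxt = step(grid, thresholds)
```

In the evolutionary setup, the per-cell thresholds (or richer rule parameters) are the genome: the evolution strategy adjusts them until iterating the automaton from the initial configuration reproduces the observed final dispersion pattern.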