886 results for "Search-based technique"
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper presents a new approach for damage detection in structural health monitoring systems that exploits the coherence function between the signals from PZT (lead zirconate titanate) transducers bonded to a host structure. The physical configuration of this new approach is similar to the configuration used in Lamb-wave-based methods, but the analysis and operation are different. A PZT excited by a signal with a wide frequency range acts as an actuator, and other PZTs are used as sensors to receive the signal. The coherences between the signals from the PZT sensors are obtained, and the standard deviation of each coherence function is computed. It is demonstrated through experimental results that the standard deviation of the coherence between the signals from the PZTs in healthy and damaged conditions is a very sensitive metric for detecting damage. Tests were carried out on an aluminum plate, and the results show that the proposed methodology could be an excellent approach for structural health monitoring (SHM) applications.
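As a rough illustration of the metric described above, the magnitude-squared coherence between two sensor signals can be computed with SciPy and summarized by its standard deviation. This is a minimal sketch under assumed sampling parameters, not the authors' implementation; the threshold logic in the comments is hypothetical.

```python
import numpy as np
from scipy.signal import coherence

def coherence_std(x_ref, x_sensor, fs, nperseg=1024):
    """Standard deviation of the magnitude-squared coherence
    between two PZT sensor signals sampled at fs Hz."""
    _, cxy = coherence(x_ref, x_sensor, fs=fs, nperseg=nperseg)
    return np.std(cxy)

# Hypothetical usage: compare a baseline (healthy) record with a new one.
# A change in the spread of the coherence suggests damage.
# sigma_healthy = coherence_std(s1_healthy, s2_healthy, fs=1e6)
# sigma_current = coherence_std(s1_current, s2_current, fs=1e6)
# damage_suspected = abs(sigma_current - sigma_healthy) > threshold
```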
Abstract:
Sugar is widely consumed worldwide, and Brazil is the largest producer, consumer, and exporter of this product. To guarantee proper development and productivity of sugarcane crops, large quantities of agrochemicals must be applied, especially herbicides and pesticides. The herbicide tebuthiuron (TBH) prevents pre- and post-emergence of infesting weeds in sugarcane cultures. Considering the importance of ensuring food safety for the population, this paper proposes a reliable method to analyse TBH in sugar matrices (brown and crystal) using square wave voltammetry (SWV) and differential pulse voltammetry (DPV) at a bare glassy carbon electrode, and investigates the electrochemical behavior of this herbicide by cyclic voltammetry (CV). Our results suggest that TBH, or the product of its reaction with the supporting electrolyte, is oxidized through irreversible transfer of one electron between the analyte and the working electrode, at a potential close to +1.16 V vs. Ag|AgCl(sat) in 0.10 mol L-1 KOH as the supporting electrolyte solution. Both DPV and SWV are satisfactory for the quantitative analysis of the analyte. DPV is more sensitive and selective, with detection limits of 0.902, 0.815, and 0.578 mg kg-1 and quantification limits of 0.009, 0.010, and 0.008 mg kg-1 in the absence of the matrix and in the presence of the crystal and brown sugar matrices, respectively. Repeatability lay between 0.53 and 13.8%, precision ranged between 4.14 and 15.0%, and recovery remained between 84.2 and 113% for DPV conducted in the absence of matrix and in the presence of the crystal sugar matrix, respectively.
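The abstract does not state how the detection and quantification limits were derived; a common convention (ICH/IUPAC) estimates them from a linear calibration curve as LOD = 3.3·s/m and LOQ = 10·s/m, where s is the residual standard deviation and m the slope. The sketch below illustrates that convention only and may differ from the paper's actual procedure.

```python
import numpy as np

def calibration_limits(conc, signal):
    """Estimate LOD and LOQ from a linear calibration curve using the
    common ICH/IUPAC convention (LOD = 3.3*s/m, LOQ = 10*s/m), where s
    is the standard deviation of the fit residuals and m is the slope."""
    m, b = np.polyfit(conc, signal, 1)
    residuals = signal - (m * conc + b)
    s = np.std(residuals, ddof=2)  # two fitted parameters
    return 3.3 * s / m, 10.0 * s / m

# Hypothetical DPV calibration data (peak current vs. TBH concentration):
# lod, loq = calibration_limits(np.array([...]), np.array([...]))
```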
Abstract:
The electromechanical impedance (EMI) technique has been successfully used in structural health monitoring (SHM) systems on a wide variety of structures. The basic concept of this technique is to monitor structural integrity by exciting and sensing a piezoelectric transducer, usually a lead zirconate titanate (PZT) wafer bonded to the structure to be monitored and excited in a suitable frequency range. Because of the piezoelectric effect, there is a relationship between the mechanical impedance of the host structure, which is directly related to its integrity, and the electrical impedance of the PZT transducer, obtained as a ratio between the excitation and sensing signals. This work presents a study on damage (leak) detection using an EMI-based method. Tests were carried out on a water rig built in a hydraulics laboratory for different leak conditions in a metallic pipeline. The influence of the position at which the PZT is bonded to the pipeline was also evaluated. The results show that leaks can effectively be detected using common damage-detection metrics such as RMSD and CCDM. Furthermore, the position of the PZT bonded to the pipe was observed to be an important variable that has to be controlled.
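For reference, the two metrics named above are commonly defined in the EMI literature as follows; this sketch uses those common forms, which may differ in normalization from the paper's exact expressions.

```python
import numpy as np

def rmsd(z_healthy, z_damaged):
    """Root-mean-square deviation between impedance signatures
    (typically the real parts), normalized by the baseline signature."""
    d = z_damaged - z_healthy
    return np.sqrt(np.sum(d**2) / np.sum(z_healthy**2))

def ccdm(z_healthy, z_damaged):
    """Correlation coefficient deviation metric: 1 minus the Pearson
    correlation between baseline and current signatures."""
    return 1.0 - np.corrcoef(z_healthy, z_damaged)[0, 1]

# Both metrics are near zero for an intact pipe and grow with a leak;
# detection thresholds must be set from baseline measurements.
```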
Abstract:
Decision-tree induction algorithms represent one of the most popular techniques for dealing with classification problems. However, traditional decision-tree induction algorithms implement a greedy approach to node splitting that is inherently susceptible to convergence to local optima. Evolutionary algorithms can avoid the problems associated with a greedy search and have been successfully applied to the induction of decision trees. Previously, we proposed a lexicographic multi-objective genetic algorithm for decision-tree induction, named LEGAL-Tree. In this work, we propose extending this approach substantially, particularly w.r.t. two important evolutionary aspects: the initialization of the population and the fitness function. We carry out a comprehensive set of experiments to validate our extended algorithm. The experimental results suggest that it is able to outperform both traditional algorithms for decision-tree induction and another evolutionary algorithm in a variety of application domains.
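A minimal sketch of the lexicographic comparison at the heart of such a fitness function: objectives are ranked by priority, and a lower-priority objective is consulted only when the higher-priority ones are tied within a tolerance. The objectives and tolerances shown are illustrative assumptions, not LEGAL-Tree's actual settings.

```python
def lex_better(a, b, tolerances):
    """Lexicographic comparison of two fitness tuples (higher is better).
    Objectives are ordered by priority; ties within a tolerance defer
    the decision to the next objective."""
    for fa, fb, tol in zip(a, b, tolerances):
        if abs(fa - fb) > tol:   # objectives differ significantly
            return fa > fb
    return False                 # tied on every objective

# e.g. prefer accuracy first, then smaller trees (size negated):
# lex_better((0.91, -15), (0.90, -40), tolerances=(0.02, 0))  -> True
```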
Abstract:
The promise of search-driven development is that developers will save time and resources by reusing external code in their local projects. To integrate this code efficiently, users must be able to trust it; the trustability of code search results is therefore just as important as their relevance. In this paper, we introduce a trustability metric to help users assess the quality of code search results and thus ease the cost-benefit analysis they undertake when trying to find suitable integration candidates. The proposed trustability metric incorporates both user votes and the cross-project activity of developers to calculate a "karma" value for each developer. Through the karma values of all its developers, a project is ranked on a trustability scale. We present JBENDER, a proof-of-concept code search engine that implements our trustability metric, and we discuss preliminary results from an evaluation of the prototype.
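A hypothetical sketch of how votes and cross-project activity might be combined into a karma score and aggregated per project; the weights and the aggregation are illustrative assumptions, not the formula from the paper.

```python
def developer_karma(votes, cross_project_count, w_votes=1.0, w_xproj=0.5):
    """Hypothetical karma score combining user votes on a developer's
    code and the number of distinct projects they contribute to."""
    return w_votes * votes + w_xproj * cross_project_count

def project_trustability(developers):
    """Rank a project by the mean karma of its developers.
    Each developer is a dict like {"votes": 12, "projects": 3}."""
    karmas = [developer_karma(d["votes"], d["projects"]) for d in developers]
    return sum(karmas) / len(karmas)
```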
Abstract:
Evaluation of a series of deaths due to a particular disease is a frequently requested task in occupational epidemiology. Several techniques are available to determine whether such a series represents an occupational health problem. Each of these techniques, however, is subject to certain limitations, including cost, applicability to a given situation, feasibility relative to available resources, or potential for bias. In light of these problems, a technique was developed to estimate the standardized mortality ratio at a greatly reduced cost. The technique is demonstrated by its application in an investigation of brain cancer among employees of a large chemical company.
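For context, the standardized mortality ratio is the ratio of observed deaths to the deaths expected under a reference population's stratum-specific rates. A minimal sketch of that definition follows; the paper's reduced-cost estimation procedure is not reproduced here, and the numbers in the comment are invented for illustration.

```python
def smr(observed_deaths, person_years, reference_rates):
    """Standardized mortality ratio: observed deaths divided by the
    deaths expected if the cohort experienced the reference population's
    stratum-specific rates. person_years and reference_rates are
    aligned by stratum (e.g. age bands)."""
    expected = sum(py * rate for py, rate in zip(person_years, reference_rates))
    return observed_deaths / expected

# Hypothetical example: observed deaths vs. three age strata.
# An SMR above 1 suggests excess mortality in the cohort.
# smr(12, [15000, 22000, 8000], [1.2e-4, 2.5e-4, 4.0e-4])
```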
Abstract:
Commercial off-the-shelf microprocessors are the core of low-cost embedded systems because of their programmability and cost-effectiveness. Recent advances in electronic technologies have allowed remarkable improvements in their performance. However, they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors) may cause a microprocessor to produce a wrong computation result or lose control of a system, with catastrophic consequences. Soft-error mitigation has therefore become a compulsory requirement for an increasing number of applications, operating from space down to ground level. In this context, this paper uses the concept of selective hardening, which aims at designing reduced-overhead and flexible mitigation techniques. Following this concept, a novel flexible version of the software-based fault-recovery technique known as SWIFT-R is proposed. Our approach makes it possible to select different register subsets from the microprocessor register file to be protected in software. The design space is thus enriched with a wide spectrum of new partially protected versions, offering more flexibility to designers and permitting the best trade-offs between performance, code size, and fault coverage to be found. Three case studies were developed to show the applicability and flexibility of the proposal.
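SWIFT-R-style recovery keeps three copies of each protected value and votes on them before use. The sketch below illustrates only the majority-voting idea; in the actual technique the copies live in machine registers and the voting code is inserted by the compiler, not written in Python.

```python
def majority(a, b, c):
    """Bitwise majority vote: recovers a value when at most one of
    three software-maintained copies has been corrupted."""
    return (a & b) | (a & c) | (b & c)

# Keep three copies of a protected value, refresh them after writes,
# and vote before reads.
x1 = x2 = x3 = 0x5A
x1 ^= 0x10                 # a simulated bit flip in one copy
x = majority(x1, x2, x3)
assert x == 0x5A           # the voted value masks the fault
```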
Abstract:
This paper presents a neural-network-based technique for classifying segments of road images into crack and normal images. Density and histogram features are extracted and passed to a neural network that classifies the images as containing cracks or not. Images classified as containing cracks are then passed, after segmentation, to another neural network that classifies the crack type. Several experiments were conducted and promising results were obtained. Selected results and a comparative analysis are included in this paper.
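A minimal sketch of the kind of density/histogram feature extraction described above; the bin count, the darkness threshold, and the downstream classifier are illustrative assumptions.

```python
import numpy as np

def histogram_features(gray_patch, bins=32):
    """Normalized gray-level histogram of an image segment plus a simple
    density feature (fraction of dark pixels), as a stand-in for the
    density/histogram features described in the paper."""
    hist, _ = np.histogram(gray_patch, bins=bins, range=(0, 255))
    hist = hist / hist.sum()
    density = np.mean(gray_patch < 128)   # assumed darkness threshold
    return np.append(hist, density)

# The feature vectors would then train a classifier, e.g. a small MLP:
# from sklearn.neural_network import MLPClassifier
# clf = MLPClassifier(hidden_layer_sizes=(16,)).fit(X_train, y_train)
# is_crack = clf.predict(histogram_features(patch).reshape(1, -1))
```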
Abstract:
The paper has been presented at the International Conference Pioneers of Bulgarian Mathematics, dedicated to Nikola Obreshkoff and Lubomir Tschakaloff, Sofia, July 2006.
Abstract:
Reverse engineering is usually the stepping stone of a variety of attacks aiming at identifying sensitive information (keys, credentials, data, algorithms) or vulnerabilities and flaws for broader exploitation. Software applications are usually deployed as identical binary code installed on millions of computers, enabling an adversary to develop a generic reverse-engineering strategy that, if it works on one code instance, can be applied to crack all the other instances. A solution to mitigate this problem is software diversity, which aims at creating several structurally different (but functionally equivalent) binary code versions out of the same source code, so that even if a successful attack can be elaborated for one version, it should not work on a diversified version. In this paper, we address the problem of maximizing software diversity from a search-based optimization point of view. The program to protect is subjected to a catalogue of transformations to generate many candidate versions. The problem of selecting the subset of most diversified versions to be deployed is formulated as an optimization problem, which we tackle with different search heuristics. We show the applicability of this approach on some popular Android apps.
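One simple way to sketch the subset-selection step is greedy farthest-point selection over a pairwise dissimilarity between binary versions. The paper evaluates several search heuristics, so this baseline and the `distance` callable are illustrative assumptions only.

```python
def select_diverse(versions, k, distance):
    """Greedy sketch of subset selection: from a catalogue of
    transformed program versions, pick k versions that maximize the
    minimum pairwise distance (some structural dissimilarity between
    binaries). Assumes 1 <= k <= len(versions)."""
    chosen = [versions[0]]                       # arbitrary seed
    while len(chosen) < k:
        # add the version farthest from everything chosen so far
        best = max((v for v in versions if v not in chosen),
                   key=lambda v: min(distance(v, c) for c in chosen))
        chosen.append(best)
    return chosen
```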
Abstract:
This article presents a back-electromotive force (BEMF)-based detection technique for sensorless brushless direct current motor (BLDCM) drivers. The BLDCM has been chosen as the energy converter in rotary or pulsatile blood pumps that use electrical motors for pumping. However, to operate properly, the BLDCM driver needs to know the shaft position. Usually, that information is obtained through a set of Hall sensors assembled close to the rotor and connected to the electronic controller by wires. A large distance between the motor and the controller can make the system susceptible to interference on the sensor signal caused by winding current switching; the goal of the sensorless technique presented in this study is to avoid this problem. First, the operation of the BLDCM was evaluated in the electronic simulator PSpice. Then, a BEMF detector circuit was assembled in our laboratories. For the tests, a sensor-dependent system was assembled in which the Hall sensor signals were compared directly with the detected signals. The results showed that the output signals of the sensorless detector are very similar to the Hall signals at speeds above 2500 rpm. The sensorless technique is therefore recommended, as either the main or a redundant system, for use in rotary blood pumps.
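For illustration, the classic sensorless approach detects where the floating-phase terminal voltage crosses half the DC bus voltage. This is a minimal offline sketch of that idea, not the paper's detector circuit; a real driver uses comparators or interrupt routines, filters out switching noise, and commutates roughly 30 electrical degrees after each crossing.

```python
import numpy as np

def bemf_zero_crossings(v_float, v_bus):
    """Indices where the sampled floating-phase terminal voltage crosses
    half the DC bus voltage, the usual sensorless commutation reference."""
    s = np.sign(v_float - v_bus / 2.0)
    return np.where(np.diff(s) != 0)[0]
```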
Abstract:
The purpose of this work is to present an algorithm for solving nonlinear constrained optimization problems, using the filter method with the inexact restoration (IR) approach. In the IR approach, two independent phases are performed in each iteration: the feasibility phase and the optimality phase. The first directs the iterative process toward the feasible region, i.e. it finds a point with smaller constraint violation. The optimality phase starts from this point, and its goal is to optimize the objective function within the space of satisfied constraints. To evaluate the candidate solutions in each iteration, a scheme based on the filter method is used in both phases of the algorithm. This method replaces merit functions based on penalty schemes, avoiding the related difficulties such as estimating the penalty parameter and the non-differentiability of some of these functions. The filter method is implemented in the context of the line-search globalization technique. A set of more than two hundred AMPL test problems is solved, and the algorithm developed is compared with the LOQO and NPSOL software packages.
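A minimal sketch of a filter acceptance test of the kind described: a trial point, summarized by its constraint violation h and objective value f, is accepted if it sufficiently improves on every filter entry in at least one of the two measures. The envelope parameters below are typical textbook values, not necessarily the paper's.

```python
def acceptable(h_new, f_new, filter_set, beta=0.99, gamma=1e-4):
    """Filter acceptance test over entries (h_j, f_j): the trial point
    must improve on each entry either in violation or in objective."""
    return all(h_new <= beta * h_j or f_new <= f_j - gamma * h_j
               for (h_j, f_j) in filter_set)

# After acceptance the new pair is added and dominated entries removed:
# filter_set = [(h, f) for (h, f) in filter_set
#               if not (h_new <= h and f_new <= f)] + [(h_new, f_new)]
```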
Abstract:
Historical renders are exposed to several degradation processes that can lead to a wide range of anomalies, such as scaling, detachments, and pulverization. Among the common anomalies, loss of cohesion and loss of adhesion are usually identified as the most difficult to repair; these anomalies still need to be studied in depth in order to design compatible, durable, and sustainable conservation treatments. The restitution of render cohesion can be achieved using consolidating products. Nevertheless, repair treatments can induce aesthetic alterations and are therefore usually followed by chromatic reintegration. This work studies the effectiveness of mineral products as consolidants for lime-based mortars and, simultaneously, as chromatic treatments for pigmented renders. The consolidating products studied are prepared by mixing air lime, metakaolin, water, and mineral pigments. The idea for these consolidating and coloring products arises from a traditional lime-based technique, the limewash, widely used in southern Europe and the Mediterranean area. The consolidating products were applied and tested on lime-based mortar specimens with a low binder-aggregate ratio and therefore reduced cohesion. A physico-mechanical, microstructural, and mineralogical characterization was performed on untreated and treated specimens in order to evaluate the efficacy and durability of the treatments. Accelerated aging tests were also performed to assess the durability of the consolidants under aggressive conditions. The results showed that the consolidants tested are compatible, effective, and possess good durability.