30 results for optimization-based similarity reasoning
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Biogeography is the science that studies the geographical distribution and migration of species in an ecosystem. Biogeography-based optimization (BBO) is a recently developed global optimization algorithm that generalizes biogeography principles to an evolutionary algorithm, and it has shown its ability to solve complex optimization problems. BBO employs a migration operator to share information between problem solutions: each solution is identified as a habitat, and the sharing of features is called migration. In this paper, a multiobjective BBO combined with a predator-prey approach (PPBBO) is proposed and validated on the constrained design of a brushless dc wheel motor. The results demonstrate that the proposed PPBBO approach converges to promising solutions in terms of quality and dominance when compared with a classical multiobjective version of BBO.
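For illustration, the migration operator described above can be sketched in a few lines; this is a minimal single-objective Python sketch based on the standard BBO formulation (names and rate models are illustrative, not the paper's code):

```python
import numpy as np

def bbo_migration(habitats, fitness, rng=None):
    """One BBO migration pass (minimization): good habitats tend to
    emigrate features, poor habitats tend to immigrate them.
    Illustrative sketch only."""
    if rng is None:
        rng = np.random.default_rng(0)
    n, d = habitats.shape
    ranks = np.argsort(np.argsort(fitness))  # rank 0 = best (lowest fitness)
    lam = ranks / (n - 1)                    # immigration rate: worst immigrates most
    mu = 1.0 - lam                           # emigration rate: best emigrates most
    new = habitats.copy()
    for i in range(n):
        for k in range(d):
            if rng.random() < lam[i]:
                # Roulette-wheel choice of an emigrating habitat by mu.
                j = rng.choice(n, p=mu / mu.sum())
                new[i, k] = habitats[j, k]
    return new
```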
Abstract:
Piezoresistive sensors are commonly made of a piezoresistive membrane attached to a flexible substrate, a plate. They have been widely studied and used in several applications. It has been found that the size, position and geometry of the piezoresistive membrane may affect the performance of the sensors. Motivated by this observation, this work evaluates a topology optimization methodology for the design of piezoresistive plate-based sensors in which both the piezoresistive membrane and the flexible-substrate layout can be optimized. Perfect coupling between the substrate and the membrane is modeled with the 'layerwise' theory for laminated plates, and the piezoresistive membrane is described by the solid isotropic material with penalization (SIMP) model. The design goal is to obtain the material configuration that maximizes the sensor sensitivity to external loading, as well as the stiffness of the sensor to particular loads, which depend on the case (application) studied. The proposed approach is evaluated in two distinct examples: the optimization of an atomic force microscope probe and of a pressure sensor. The results suggest that the performance of the sensors can be improved by using the proposed approach.
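As a rough illustration of the SIMP ingredient mentioned above, each element's material property is scaled by a penalized pseudo-density, so intermediate densities become uneconomical for the optimizer (a generic sketch; the constants are hypothetical, not taken from the paper):

```python
def simp_modulus(rho, E0=200e9, Emin=1e-6, p=3.0):
    """SIMP interpolation: a pseudo-density rho in [0, 1] scales the
    stiffness; the exponent p > 1 penalizes intermediate values,
    pushing the design toward solid (1) or void (0)."""
    return Emin + rho**p * (E0 - Emin)
```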
Abstract:
In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, the results depend on the failure probabilities used as constraints in the analysis. Risk optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations that respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs while increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration.
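The trade-off that RO resolves can be made concrete with a toy cost model: total expected cost adds the expected cost of failure to construction cost, and the optimum is where the two balance. A hedged one-variable Python sketch (all costs and the reliability index model are hypothetical):

```python
import numpy as np
from scipy.stats import norm

def total_expected_cost(t):
    """Toy RO objective for a member of thickness t: construction cost
    grows with t, while the failure probability (hence the expected
    failure cost) shrinks. All numbers are hypothetical."""
    c_construction = 100.0 * t
    beta = 2.0 + 3.0 * t            # assumed reliability index model
    p_f = norm.cdf(-beta)           # failure probability from beta
    c_failure = 1e6                 # assumed monetary consequence of failure
    return c_construction + p_f * c_failure

ts = np.linspace(0.1, 2.0, 200)
t_opt = ts[np.argmin([total_expected_cost(t) for t in ts])]
print(f"optimum thickness ~ {t_opt:.2f}")
```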
Abstract:
This paper provides an improved NSGA-II (Non-Dominated Sorting Genetic Algorithm, version II) that incorporates a parameter-free self-tuning approach based on a reinforcement learning technique, called the Non-Dominated Sorting Genetic Algorithm Based on Reinforcement Learning (NSGA-RL). The proposed method is compared with the classical NSGA-II when applied to a satellite coverage problem. Furthermore, the optimization results are not only compared with results obtained by other multiobjective optimization methods, but also show that the method avoids time-consuming and complex parameter tuning.
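The abstract does not detail the learning scheme; one plausible sketch of parameter self-tuning is a simple epsilon-greedy bandit that picks among candidate operator settings and is rewarded by the improvement of a quality indicator (purely illustrative; NSGA-RL's actual mechanism may differ):

```python
import numpy as np

class BanditTuner:
    """Epsilon-greedy choice among candidate parameter settings (e.g.,
    crossover/mutation rates), rewarded by improvement of an indicator
    such as hypervolume. A generic sketch, not NSGA-RL itself."""
    def __init__(self, candidates, eps=0.1, seed=0):
        self.candidates = candidates
        self.eps = eps
        self.rng = np.random.default_rng(seed)
        self.value = np.zeros(len(candidates))  # running mean reward
        self.count = np.zeros(len(candidates))

    def select(self):
        if self.rng.random() < self.eps:
            self.i = int(self.rng.integers(len(self.candidates)))
        else:
            self.i = int(np.argmax(self.value))
        return self.candidates[self.i]

    def update(self, reward):
        self.count[self.i] += 1
        self.value[self.i] += (reward - self.value[self.i]) / self.count[self.i]
```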
Abstract:
The influence of glycerol concentration (C_g), process temperature (T_p), drying temperature (T_s), and relative humidity (RH) on the properties of achira flour films was initially assessed. The optimized process conditions were a C_g of 17 g glycerol/100 g flour, T_p of 90 °C, T_s of 44.8 °C, and RH of 36.4%. The films produced under these conditions displayed high mechanical strength (7.0 MPa), low solubility (38.3%), and satisfactory elongation values (14.6%). This study showed that achira flour is a promising source for the development of biodegradable films with good mechanical properties and with low water vapor permeability and solubility compared to films based on other tubers.
Abstract:
In deterministic optimization, the uncertainties of the structural system (e.g., dimensions, model, material, loads) are not explicitly taken into account. Hence, the resulting optimal solutions may lead to reduced reliability levels. The objective of reliability-based design optimization (RBDO) is to optimize structures while guaranteeing that a minimum level of reliability, chosen a priori by the designer, is maintained. Since reliability analysis using the First Order Reliability Method (FORM) is an optimization procedure itself, RBDO (in its classical version) is a double-loop strategy: the reliability analysis (inner loop) and the structural optimization (outer loop). The coupling of these two loops leads to very high computational costs. To reduce the computational burden of RBDO based on FORM, several authors propose decoupling the structural optimization and the reliability analysis. These procedures may be divided into two groups: (i) serial single-loop methods and (ii) unilevel methods. The basic idea of serial single-loop methods is to decouple the two loops and solve them sequentially until some convergence criterion is achieved. Unilevel methods, on the other hand, employ different strategies to obtain a single optimization loop that solves the RBDO problem. This paper presents a review of such RBDO strategies. A comparison of the performance (computational cost) of the main strategies is presented for several variants of two benchmark problems from the literature and for a structure modeled using the finite element method.
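The double-loop structure can be sketched as follows: an outer design optimizer evaluates, for every candidate design, an inner FORM analysis (here a basic Hasofer-Lind-Rackwitz-Fiessler iteration). The limit state and all constants below are hypothetical, chosen only to make the sketch runnable:

```python
import numpy as np
from scipy.optimize import minimize

def form_beta(g, grad_g, ndim=2, tol=1e-6, it_max=50):
    """Inner loop: HL-RF iteration for the reliability index beta of
    the limit state g(u) = 0 in standard normal space."""
    u = np.zeros(ndim)
    for _ in range(it_max):
        gv, gr = g(u), grad_g(u)
        u_new = (gr @ u - gv) * gr / (gr @ gr)
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return np.linalg.norm(u)

def beta_of(d):
    # Hypothetical linear limit state whose capacity grows with d.
    g = lambda u: d[0] + d[1] - 3.0 + u[0] + 0.5 * u[1]
    grad = lambda u: np.array([1.0, 0.5])
    return form_beta(g, grad)

# Outer loop: minimize "weight" d0 + d1 subject to beta(d) >= 3.
res = minimize(lambda d: d[0] + d[1], x0=[2.0, 2.0],
               constraints={"type": "ineq",
                            "fun": lambda d: beta_of(d) - 3.0},
               bounds=[(0.1, 10.0), (0.1, 10.0)])
print(res.x, beta_of(res.x))
```

Every outer iteration triggers a full inner FORM run, which is precisely the cost that the serial single-loop and unilevel methods reviewed in the paper try to avoid.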
Abstract:
Combining data from multiple analytical platforms is essential for a comprehensive study of the molecular phenotype (metabotype) of a given biological sample. The metabolite profiles generated are intrinsically dependent on the analytical platforms, each requiring optimization of instrumental parameters, separation conditions, and sample extraction to deliver maximal biological information. An in-depth evaluation of extraction protocols for characterizing the metabolome of the hepatobiliary fluke Fasciola hepatica, using ultra-performance liquid chromatography and capillary electrophoresis coupled with mass spectrometry, is presented. The spectrometric methods were characterized by performance, and metrics of merit were established, including precision, mass accuracy, selectivity, sensitivity, and platform stability. Although a core group of molecules was common to all methods, each platform contributed a unique set, whereby 142 metabolites out of 14,724 features were identified. A mixture design revealed that a chloroform:methanol:water proportion of 15:59:26 was globally the best composition for metabolite extraction across the UPLC-MS and CE-MS platforms, accommodating different columns and ionization modes. Despite the general assumption that platform-adapted protocols are necessary for effective metabotype characterization, we show that an appropriately designed single extraction procedure is able to fit the requirements of all technologies. This may constitute a paradigm shift in developing efficient protocols for high-throughput metabolite profiling with more general analytical applicability.
Abstract:
Trypanothione reductase has long been investigated as a promising target for chemotherapeutic intervention in Chagas disease, since it belongs to a metabolic pathway that is present exclusively in the pathogen; the human host has the analogous enzyme glutathione reductase. Although the present data set includes a small number of compounds, the combined use of flexible docking, pharmacophore perception, ligand-binding-site prediction, and grid-independent descriptor (GRIND2)-based 3D quantitative structure-activity relationship (QSAR) procedures allowed us to rationalize the different biological activities of a series of 11 aryl beta-aminocarbonyl derivatives, which are inhibitors of Trypanosoma cruzi trypanothione reductase (TcTR). Three QSAR models were built and validated using different alignments, based on docking with the TcTR crystal structure, on the pharmacophore, and on molecular interaction fields. The high statistical significance of the resulting models attests to the robustness of the second generation of GRIND descriptors used here, which were able to detect the enzyme residues most important for binding the aryl beta-aminocarbonyl derivatives and to rationalize the distances among them. Finally, a revised binding mode is proposed for our inhibitors and independently supported by the different methodologies used here, allowing further optimization of the lead compounds with such combined structure- and ligand-based approaches in the fight against Chagas disease.
Abstract:
The classification of texts has become a major endeavor with so much electronic material available, for it is an essential task in several applications, including search engines and information retrieval. There are different ways to define similarity for grouping similar texts into clusters, as the concept of similarity may depend on the purpose of the task. For instance, in topic extraction similar texts mean those within the same semantic field, whereas in author recognition stylistic features should be considered. In this study, we introduce ways to classify texts employing concepts of complex networks, which may be able to capture syntactic, semantic and even pragmatic features. The interplay between various metrics of the complex networks is analyzed in three applications, namely identification of machine translation (MT) systems, evaluation of the quality of machine-translated texts, and authorship recognition. We show that topological features of the networks representing texts can enhance the ability to identify MT systems in particular cases. For evaluating the quality of MT texts, on the other hand, high correlation was obtained with methods capable of capturing the semantics. This was expected because the gold standards used are themselves based on word co-occurrence. Notwithstanding, the Katz similarity, which combines semantics and structure in the comparison of texts, achieved the highest correlation with the NIST measurement, indicating that in some cases the combination of both approaches can improve the ability to quantify quality in MT. In authorship recognition, again the topological features were relevant in some contexts, though for the books and authors analyzed good results were obtained with semantic features as well. Because hybrid approaches encompassing semantic and topological features have not been extensively used, we believe that the methodology proposed here may enhance text classification considerably, as it combines well-established strategies.
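The Katz similarity mentioned above sums walks of all lengths between two nodes, with length-k walks damped by a factor alpha**k; a minimal sketch over a word-adjacency network (illustrative, not the authors' pipeline):

```python
import numpy as np

def katz_similarity(A, alpha=0.05):
    """Katz index S = (I - alpha*A)^{-1} - I. Converges when alpha is
    smaller than the reciprocal of A's largest eigenvalue."""
    n = A.shape[0]
    return np.linalg.inv(np.eye(n) - alpha * A) - np.eye(n)

def adjacency_network(text):
    """Symmetric adjacency matrix linking consecutive words."""
    words = text.lower().split()
    vocab = sorted(set(words))
    idx = {w: i for i, w in enumerate(vocab)}
    A = np.zeros((len(vocab), len(vocab)))
    for a, b in zip(words, words[1:]):
        if a != b:
            A[idx[a], idx[b]] = A[idx[b], idx[a]] = 1.0
    return A, vocab

A, vocab = adjacency_network("the cat sat on the mat the cat ran")
S = katz_similarity(A)  # S[i, j]: walk-based similarity of words i and j
```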
Abstract:
Piezoelectric materials can be used to convert oscillatory mechanical energy into electrical energy. Energy harvesting devices are designed to capture the ambient energy surrounding the electronics and convert it into usable electrical energy. The design of energy harvesting devices is not obvious and requires optimization procedures. This paper investigates the influence of pattern gradation using topology optimization on the design of piezocomposite energy harvesting devices based on bending behavior. The objective function consists of maximizing the electric power generated in a load resistor. A projection scheme is employed to compute the element densities from the design variables and control the length scale of the material density. Examples of two-dimensional piezocomposite energy harvesting devices are presented and discussed using the proposed method. The numerical results illustrate that pattern gradation constraints help to increase the electric power generated in a load resistor and guide the problem toward a more stable solution.
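The projection scheme referred to above maps design variables to element densities while imposing a minimum length scale; a common realization is a linear density filter followed by a smoothed Heaviside projection (a generic sketch in the spirit of such schemes, not the paper's exact operator):

```python
import numpy as np

def project_densities(d, weights, beta=8.0):
    """Element densities from design variables: a weighted average over
    a neighborhood (rows of `weights`) enforces the length scale, and
    a smoothed Heaviside sharpens the result toward 0/1 as beta grows."""
    rho_filtered = (weights @ d) / weights.sum(axis=1)
    return 1.0 - np.exp(-beta * rho_filtered) + rho_filtered * np.exp(-beta)
```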
Abstract:
The wide variety of molecular architectures used in sensors and biosensors and the large amount of data generated with some principles of detection have motivated the use of computational methods, such as information visualization techniques, not only to handle the data but also to optimize sensing performance. In this study, we combine projection techniques with micro-Raman scattering and atomic force microscopy (AFM) to address critical issues related to practical applications of electronic tongues (e-tongues) based on impedance spectroscopy. Experimentally, we used sensing units made with thin films of a perylene derivative (AzoPTCD acronym), coating Pt interdigitated electrodes, to detect CuCl2 (Cu2+), methylene blue (MB), and saccharose in aqueous solutions, which were selected due to their distinct molecular sizes and ionic character in solution. The AzoPTCD films were deposited from monolayers to 120 nm via Langmuir-Blodgett (LB) and physical vapor deposition (PVD) techniques. Because the main aspects investigated were how the interdigitated electrodes are coated by thin films (architecture on the e-tongue) and the film thickness, we employed the same material for all sensing units. The capacitance data were projected into a 2D plot using the force scheme method, from which we could infer that at low analyte concentrations the electrical response of the units was determined by the film thickness. Concentrations of 10 μM or higher could be distinguished with thinner films (tens of nanometers at most), which could withstand the impedance measurements without causing significant changes in the Raman signal of the AzoPTCD film-forming molecules. The sensitivity to the analytes appears to be related to adsorption on the film surface, as inferred from Raman spectroscopy data using MB as the analyte and from the multidimensional projections. The analysis presented here may serve as a new route to select materials and molecular architectures for novel sensors and biosensors, in addition to suggesting ways to unravel the mechanisms behind the high sensitivity obtained in various sensors.
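The force scheme used for the 2D plots is an iterative projection: points in the plane are repeatedly pushed or pulled so that their pairwise 2D distances approach the original high-dimensional distances. A compact sketch (learning rate and iteration count are arbitrary choices):

```python
import numpy as np

def force_scheme(D, n_iter=50, lr=0.1, seed=0):
    """Project items with pairwise distance matrix D into 2D by moving
    each point along the residual between target and current distance.
    O(n^2) per iteration; a didactic sketch of the idea."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    Y = rng.random((n, 2))
    for _ in range(n_iter):
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                v = Y[j] - Y[i]
                d2d = np.linalg.norm(v) + 1e-9
                delta = D[i, j] - d2d          # positive: points too close
                Y[j] += lr * delta * v / d2d   # move j away from / toward i
    return Y
```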
Abstract:
XML similarity evaluation has become a central issue in the database and information communities, its applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as Ordered Labeled Trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree related structural and semantic similarities, which are not sufficiently addressed while comparing XML documents. In this paper, we provide an integrated and fine-grained comparison framework to deal with both structural and semantic similarities in XML documents (detecting the occurrences and repetitions of structurally and semantically similar sub-trees), and to allow the end-user to adjust the comparison process according to her requirements. Our framework consists of four main modules for (i) discovering the structural commonalities between sub-trees, (ii) identifying sub-tree semantic resemblances, (iii) computing tree-based edit operations costs, and (iv) computing tree edit distance. Experimental results demonstrate higher comparison accuracy with respect to alternative methods, while timing experiments reflect the impact of semantic similarity on overall system performance.
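Since the framework's last two modules compute edit operation costs and tree edit distance, the core recursion is worth making explicit. A simplified unit-cost version for ordered labeled trees is sketched below (exponential in the worst case; production systems use dynamic-programming algorithms such as Zhang-Shasha):

```python
from functools import lru_cache

# A tree is a tuple (label, (child, child, ...)); unit costs assumed.
def tree_edit_distance(t1, t2):
    """Edit distance between ordered labeled trees with unit-cost
    insertion, deletion and relabeling. Didactic sketch."""
    def tree_size(t):
        return 1 + sum(tree_size(c) for c in t[1])

    @lru_cache(maxsize=None)
    def forest_dist(f1, f2):
        if not f1 and not f2:
            return 0
        if not f1:
            return sum(tree_size(t) for t in f2)   # insert everything
        if not f2:
            return sum(tree_size(t) for t in f1)   # delete everything
        (l1, c1), rest1 = f1[-1], f1[:-1]
        (l2, c2), rest2 = f2[-1], f2[:-1]
        return min(
            forest_dist(rest1 + c1, f2) + 1,       # delete rightmost root of f1
            forest_dist(f1, rest2 + c2) + 1,       # insert rightmost root of f2
            forest_dist(rest1, rest2)              # match the rightmost trees
            + forest_dist(c1, c2)
            + (0 if l1 == l2 else 1),              # relabel cost
        )

    return forest_dist((t1,), (t2,))

a = ("article", (("title", ()), ("body", (("p", ()),))))
b = ("article", (("heading", ()), ("body", ())))
print(tree_edit_distance(a, b))  # 2: relabel title->heading, delete p
```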
Abstract:
Gelatin-based films containing both Yucca schidigera extract and low concentrations of glycerol (0.25-8.75 g per 100 g of protein) were produced by extrusion (EF) and characterized in terms of their mechanical properties and moisture content. The formulations that resulted in either the largest or the smallest elongation values were used to produce films via both blown extrusion (EBF) and casting (CF), and these were characterized with respect to their mechanical properties, water vapor permeability, moisture content, solubility, morphology and infrared spectra. The elongation of the EF films was significantly higher than that of the CF and EBF films. The cross-section of all of the films studied possessed a compact, homogeneous structure. The solubility of the films (36-40%) did not differ significantly between the processes evaluated. The EBF films demonstrated lower water vapor permeability (0.12 g mm m⁻² h⁻¹ kPa⁻¹) than the CF and EF films. The infrared spectra did not indicate any strong interactions between the added compounds. Thermoplastic processing of the gelatin films can significantly increase their elongation; however, a more detailed assessment and optimization of the extrusion conditions is necessary, along with the addition of partially hydrophobic compounds, such as surfactants.
Abstract:
Ultrasonography has an inherent noise pattern, called speckle, which is known to hamper object recognition for both humans and computers. Speckle noise is produced by the mutual interference of a set of scattered wavefronts. Depending on the phase of the wavefronts, the interference may be constructive or destructive, which results in brighter or darker pixels, respectively. We propose a filter that minimizes noise fluctuation while simultaneously preserving local gray-level information. It is based on steps that attenuate the destructive and constructive interference present in ultrasound images. This filter, called the interference-based speckle filter followed by anisotropic diffusion (ISFAD), was developed to remove speckle texture from B-mode ultrasound images while preserving the edges and the gray level of the region. The ISFAD performance was compared with that of 10 other filters. The evaluation was based on their application to images simulated by Field II (developed by Jensen et al.), and the proposed filter presented the greatest structural similarity (0.95). Functional improvement of the segmentation task was also measured by comparing true positive rates, false positive rates and accuracy. With three different segmentation techniques, ISFAD also presented the best accuracy rate (greater than 90% for structures with well-defined borders).
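The anisotropic diffusion stage of such filters smooths homogeneous regions while leaving edges largely intact, because the conduction coefficient decays with the local gradient. A minimal Perona-Malik step (a generic sketch, not the ISFAD implementation):

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, lam=0.2):
    """Perona-Malik diffusion: neighbor differences are weighted by a
    conduction coefficient that vanishes at strong edges, so flat
    regions are smoothed while edges are preserved."""
    u = img.astype(float).copy()
    c = lambda d: np.exp(-(d / kappa) ** 2)    # conduction coefficient
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u        # differences to the four
        ds = np.roll(u, 1, axis=0) - u         # nearest neighbors
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += lam * (c(dn) * dn + c(ds) * ds + c(de) * de + c(dw) * dw)
    return u
```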
Abstract:
Traditional supervised data classification considers only physical features (e.g., distance or similarity) of the input data. Here, this type of learning is called low-level classification. On the other hand, the human (animal) brain performs both low and high orders of learning, and it easily identifies patterns according to the semantic meaning of the input data. Data classification that considers not only physical attributes but also pattern formation is referred to here as high-level classification. In this paper, we propose a hybrid classification technique that combines both types of learning. The low-level term can be implemented by any classification technique, while the high-level term is realized by extracting features of the underlying network constructed from the input data. Thus, the former classifies test instances by their physical features or class topologies, while the latter measures the compliance of test instances with the pattern formation of the data. Our study shows that the proposed technique not only realizes classification according to pattern formation, but is also able to improve the performance of traditional classification techniques. Furthermore, as the complexity of the class configuration increases, for example with greater mixing among different classes, a larger portion of the high-level term is required for correct classification. This feature confirms that high-level classification has special importance in complex classification settings. Finally, we show how the proposed technique can be employed in a real-world application, where it is capable of identifying variations and distortions of handwritten digit images; as a result, it improves the overall pattern recognition rate.
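The combination of the two terms can be expressed as a convex mixture of the low-level and high-level membership scores; a schematic sketch (the convex-combination form follows the abstract's description, but the scores and weight below are hypothetical):

```python
import numpy as np

def hybrid_score(low_level, high_level, lam=0.3):
    """Per-class membership: (1 - lam) * low-level similarity score
    + lam * high-level network-conformity score. Larger lam gives the
    pattern-formation term more influence."""
    low = np.asarray(low_level, dtype=float)
    high = np.asarray(high_level, dtype=float)
    return (1.0 - lam) * low / low.sum() + lam * high / high.sum()

# Example with three classes, where the two terms disagree:
print(hybrid_score([0.6, 0.3, 0.1], [0.2, 0.7, 0.1], lam=0.5))
```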