893 results for Feature Extraction Algorithms
Abstract:
Natural selection favors the survival and reproduction of organisms that are best adapted to their environment. The selection mechanism in evolutionary algorithms mimics this process, aiming to create environmental conditions in which artificial organisms can evolve to solve the problem at hand. This paper proposes a new selection scheme for evolutionary multiobjective optimization. A similarity measure defining the concept of neighborhood is a key feature of the proposed selection. Contrary to commonly used approaches, usually defined on the basis of distances between either individuals or weight vectors, it is suggested to base similarity and neighborhood on the angle between individuals in the objective space: the smaller the angle, the more similar the individuals. This notion is exploited during the mating and environmental selections. Convergence is ensured by minimizing the distances from individuals to a reference point, whereas diversity is preserved by maximizing the angles between neighboring individuals. Experimental results reveal a highly competitive performance and useful characteristics of the proposed selection. Its strong diversity-preserving ability allows it to produce significantly better performance on some problems when compared with state-of-the-art algorithms.
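The abstract gives no pseudocode, so the following is only a rough sketch of the idea it describes: similarity measured by the angle between objective vectors, convergence driven by the distance to a reference point, and diversity by the angle to the nearest already selected neighbour. The function names, the greedy selection loop, and the trade-off weight are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def angle(a, b):
    """Angle (radians) between two objective vectors; smaller = more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def environmental_selection(objs, ref_point, k, trade_off=0.1):
    """Illustrative sketch: keep k individuals, favouring a small distance to the
    reference point (convergence) and a large angle to the most similar
    already-kept neighbour (diversity). objs is an (n, m) array of objective vectors."""
    translated = objs - ref_point                 # work relative to the reference point
    dist = np.linalg.norm(translated, axis=1)
    kept = [int(np.argmin(dist))]                 # start from the best-converged individual
    while len(kept) < k:
        best, best_score = None, -np.inf
        for i in range(len(objs)):
            if i in kept:
                continue
            # diversity term: angle to the most similar already-kept individual
            min_angle = min(angle(translated[i], translated[j]) for j in kept)
            score = min_angle - trade_off * dist[i]   # assumed trade-off, not the paper's rule
            if score > best_score:
                best, best_score = i, score
        kept.append(best)
    return kept

# Tiny usage example with random bi-objective values.
pop = np.random.rand(20, 2)
print(environmental_selection(pop, ref_point=np.zeros(2), k=5))
```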
Abstract:
Integrated master's dissertation in Biomedical Engineering (specialization in Medical Electronics)
Abstract:
The chemical composition of propolis is affected by environmental factors and harvest season, making it difficult to standardize its extracts for medicinal use. By detecting a typical chemical profile associated with propolis from a specific production region or season, certain types of propolis may be used to obtain a specific pharmacological activity. In this study, propolis from three agroecological regions (plain, plateau, and highlands) of southern Brazil, collected over the four seasons of 2010, was investigated through a novel NMR-based metabolomics data analysis workflow. Chemometrics and machine learning algorithms (PLS-DA and RF), including methods to estimate variable importance in classification, were used in this study. The machine learning and feature selection methods permitted the construction of models for propolis sample classification with high accuracy (>75%, reaching 90% in the best case), discriminating samples better by collection season than by harvest region. PLS-DA and RF allowed the identification of biomarkers for sample discrimination, expanding the set of discriminating features and adding relevant information for the identification of the class-determining metabolites. The NMR-based metabolomics analytical platform, coupled to bioinformatic tools, allowed the characterization and classification of Brazilian propolis samples regarding the metabolite signature of important compounds, i.e., chemical fingerprint, harvest seasons, and production regions.
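The abstract names PLS-DA and random forests with variable-importance estimation but gives no implementation details. A minimal sketch of the random-forest part, assuming a samples-by-NMR-buckets feature matrix `X` and harvest-season labels `y` (both placeholders here, since the spectral preprocessing is not described), could look like this with the scikit-learn API:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: (n_samples, n_bins) matrix of NMR bucket intensities and season labels.
rng = np.random.default_rng(0)
X = rng.random((60, 200))
y = rng.choice(["summer", "autumn", "winter", "spring"], size=60)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
accuracy = cross_val_score(rf, X, y, cv=5).mean()   # cross-validated classification accuracy

rf.fit(X, y)
# Variable importance: rank NMR bins by their contribution to the classification,
# i.e. candidate biomarkers for season discrimination.
top_bins = np.argsort(rf.feature_importances_)[::-1][:10]
print(f"CV accuracy: {accuracy:.2f}, most discriminating bins: {top_bins}")
```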
Abstract:
Doctoral thesis (Doctoral Programme in Biomedical Engineering)
Abstract:
PhD thesis in Biomedical Engineering
Abstract:
Distributed data aggregation is an important task, allowing the decentralized determination of meaningful global properties, which can then be used to direct the execution of other applications. These values result from the distributed computation of functions like count, sum and average. Application examples include determining the network size, total storage capacity, average load, majorities and many others. In the last decade, many different approaches have been proposed, with different trade-offs in terms of accuracy, reliability, message and time complexity. Due to the considerable amount and variety of aggregation algorithms, it can be difficult and time consuming to determine which techniques are most appropriate to use in specific settings, justifying the existence of a survey to aid in this task. This work reviews the state of the art on distributed data aggregation algorithms, providing three main contributions. First, it formally defines the concept of aggregation, characterizing the different types of aggregation functions. Second, it succinctly describes the main aggregation techniques, organizing them into a taxonomy. Finally, it provides some guidelines toward the selection and use of the most relevant techniques, summarizing their principal characteristics.
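As a concrete illustration of one of the averaging techniques such a survey covers, a push-sum style gossip protocol can be simulated as below; this toy synchronous, single-process simulation is for illustration only and is not taken from the survey itself.

```python
import random

def push_sum_average(values, rounds=50, seed=0):
    """Toy synchronous simulation of push-sum gossip averaging.
    Each node keeps a (sum, weight) pair; the ratio converges to the global mean."""
    random.seed(seed)
    n = len(values)
    s = list(values)          # running sums
    w = [1.0] * n             # running weights
    for _ in range(rounds):
        new_s, new_w = [0.0] * n, [0.0] * n
        for i in range(n):
            target = random.randrange(n)        # pick a random peer
            for j in (i, target):               # keep half, send half to the peer
                new_s[j] += s[i] / 2.0
                new_w[j] += w[i] / 2.0
        s, w = new_s, new_w
    return [s[i] / w[i] for i in range(n)]      # each node's estimate of the average

print(push_sum_average([10.0, 2.0, 6.0, 4.0]))   # all estimates approach 5.5
```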
Abstract:
Software product lines (SPL) are diverse systems that are developed using a dual engineering process: (a) family engineering defines the commonality and variability among all members of the SPL, and (b) application engineering derives specific products based on the common foundation combined with a variable selection of features. The number of derivable products in an SPL can thus be exponential in the number of features. This inherent complexity poses two main challenges when it comes to modelling. Firstly, the formalism used for modelling SPLs needs to be modular and scalable. Secondly, it should ensure that all products behave correctly by providing the ability to analyse and verify complex models efficiently. In this paper we propose to integrate an established modelling formalism (Petri nets) with the domain of software product line engineering. To this end we extend Petri nets to Feature Nets. While Petri nets provide a framework for formally modelling and verifying single software systems, Feature Nets offer the same sort of benefits for software product lines. We show how SPLs can be modelled in an incremental, modular fashion using Feature Nets, provide a Feature Nets variant that supports modelling dynamic SPLs, and propose an analysis method for SPLs modelled as Feature Nets. By facilitating the construction of a single model that includes the various behaviours exhibited by the products in an SPL, we make a significant step towards efficient and practical quality assurance methods for software product lines.
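As a rough intuition only (this data structure and guard evaluation are a simplified assumption, not the paper's formal definition of Feature Nets), a Petri-net transition annotated with a feature condition could be represented as:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Transition:
    """Petri-net transition guarded by a feature condition (simplified sketch).
    The transition contributes to a product's behaviour only if its guard holds
    for that product's feature selection."""
    name: str
    pre: dict            # place -> number of tokens consumed
    post: dict           # place -> number of tokens produced
    guard: Callable      # set of selected feature names -> bool

def enabled(t, marking, features):
    """Fires only if the feature guard is satisfied and every input place has enough tokens."""
    return t.guard(features) and all(marking.get(p, 0) >= n for p, n in t.pre.items())

def fire(t, marking):
    m = dict(marking)
    for p, n in t.pre.items():
        m[p] -= n
    for p, n in t.post.items():
        m[p] = m.get(p, 0) + n
    return m

# Example: a step that exists only in products with the (hypothetical) Espresso feature.
brew = Transition("brew_espresso", pre={"idle": 1}, post={"brewing": 1},
                  guard=lambda fs: "Espresso" in fs)
marking = {"idle": 1}
if enabled(brew, marking, {"Espresso", "Milk"}):
    marking = fire(brew, marking)
print(marking)   # {'idle': 0, 'brewing': 1}
```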
Abstract:
[Excerpt] Cupuassu (Theobroma grandiflorum), tucumã (Astrocaryum aculeatum), peach palm (Bactris gasipaes) and abricó (Mammea americana L.) are exotic fruits found in the Brazilian Amazon rainforest. All of them are well known by the native populations, and for centuries the pulps have been used in the production of juices, desserts, jams, syrups, and alcoholic beverages, among others. Additionally, the fruit seeds have been used as animal feed, fertilizers or to plant new seedlings, but a great part of these seeds are usually discarded. (...)
Abstract:
[Excerpt] The isolation and purification of valuable compounds are very important processes for valorizing agro-food byproducts. Currently, protein extraction and the development of environmentally friendly technologies are industrially relevant topics [1]. Among the proteins extracted from byproducts, proteases are a relevant group for industrial applications. These enzymes are a class of hydrolytic enzymes capable of cleaving the peptide bonds of protein chains and are essential in physiological processes [2]. (...)
Abstract:
In the search to increase the supply of liquid, clean, renewable and sustainable energy in the world energy matrix, the use of lignocellulosic materials (LCMs) for bioethanol production arises as a valuable alternative. The objective of this work was to analyze and compare the performance of Saccharomyces cerevisiae, Pichia stipitis and Zymomonas mobilis in the production of bioethanol from mature coconut fibre (CFM) using different strategies: simultaneous saccharification and fermentation (SSF) and semi-simultaneous saccharification and fermentation (SSSF). The CFM was pretreated by hydrothermal pretreatment catalyzed with sodium hydroxide (HPCSH). The pretreated CFM was characterized by X-ray diffractometry and SEM, and the lignin recovered in the liquid phase by FTIR and TGA. After the HPCSH pretreatment (2.5% (v/v) sodium hydroxide at 180 °C for 30 min), the cellulose content was 56.44%, while the hemicellulose and lignin contents were reduced by 69.04% and 89.13%, respectively. Following pretreatment, the obtained cellulosic fraction was submitted to SSF and SSSF. Pichia stipitis allowed for the highest ethanol yield (90.18%) in SSSF; yields of 91.17% and 91.03% were obtained with Saccharomyces cerevisiae and Zymomonas mobilis, respectively. It may be concluded that the selection of the most efficient microorganism for obtaining high bioethanol yields from cellulose pretreated by HPCSH depends on the operational strategy used, and that this pretreatment is an interesting alternative for adding value to mature coconut fibre compounds (lignin, phenolics), in accordance with the biorefinery concept.
Abstract:
Polysaccharides and oligosaccharides can improve the quality and enhance the nutritional value of final food products due to their technological and nutritional features, ranging from their capacity to improve texture to their effect as dietary fibers. For this reason, they are among the most studied ingredients in the food industry. The use of natural polysaccharides and oligosaccharides as food additives has been a reality since the food industry understood their potential technological and nutritional applications. Currently, the replacement of traditional ingredients and/or the synergy between traditional ingredients and polysaccharides and oligosaccharides are perceived as promising approaches by the food industry. Traditionally, polysaccharides have been used as thickening, emulsifying, and stabilizing agents; nowadays, however, polysaccharides and oligosaccharides also claim health and nutritional advantages, thus opening a new market of nutritional and functional foods. Indeed, their use as nutritional food ingredients has enabled the food industry to develop countless applications, e.g., fat replacers, prebiotics, dietary fiber, and antiulcer agents. Accordingly, in recent years many research studies and commercial products from the scientific community and the food industry have shown the possibility of using either new or already used sources (though with changed properties) of polysaccharides for the production of food additives with new and enhanced properties. The increasing interest in such products is clearly illustrated by market figures and consumption trends. As an example, the hydrocolloid market alone is estimated to reach $7 billion in 2018. Moreover, oligosaccharides can be found in more than 500 food products, resulting in a significant daily consumption. A recent study by Transparency Market Research on the Prebiotic Ingredients Market reported that prebiotics' demand was worth $2.3 billion in 2012 and is estimated to reach $4.5 billion in 2018, growing at a compound annual growth rate of 11.4% between 2012 and 2018. The entrance of this new generation of food additives into the market, often claiming health and nutritional benefits, requires an impartial analysis by the legal authorities regarding the fulfilment of the requirements established for introducing novel ingredients/foods, including new poly- and oligosaccharides. This chapter deals with the potential use of polysaccharides and oligosaccharides as food additives, as well as alternative sources of these compounds and their possible applications in food products. Moreover, the regulation process for introducing novel polysaccharides and oligosaccharides into the market as food additives and for assigning them health claims is discussed.
Abstract:
OBJECTIVE: To analyze the results of laser-assisted extraction of permanent pacemaker and defibrillator leads. METHODS: We operated upon 36 patients, whose mean age was 54.2 years, and extracted 56 leads. The reasons for extracting the leads were as follows: infection in 19 patients, elective replacement in 13, and other causes in 4 patients. The mean time of catheter placement was 7.5±5.5 years. Forty-seven leads were from pacemakers, and 9 were from defibrillators. Thirty-eight leads were in use, 14 had been abandoned in the pacemaker pocket, and 4 had been abandoned inside the venous system. RESULTS: We successfully extracted 54 catheters, obtaining a 96.4% rate of success and an 82.1% rate for complete extraction. The 2 unsuccessful cases were due to the presence of calcium in the trajectory of the lead. The mean duration of laser light application was 123.0±104.5 s, using 5,215.2±4,924.0 pulses, in a total of 24.4±24.2 cycles of application. Thirty-four leads were extracted from the myocardium with countertraction after complete progression of the laser sheath, 12 leads came loose during the progression of the laser sheath, and the remaining 10 were extracted with other maneuvers. One patient experienced cardiac tamponade after extraction of the defibrillator lead, requiring open emergency surgery. CONCLUSION: The use of the excimer laser allowed extraction of the leads with a 96% rate of success; it was not effective in 2 patients who had calcification on the lead. One patient (2.8%) had a complication that required cardiac surgery on an emergency basis.
Abstract:
In this project, numerical algorithms will be developed for nonlinear hyperbolic-parabolic systems of partial differential equations. Such systems have applications in wave propagation in aerospace and astrophysical settings. General objectives: 1) Development and improvement of numerical algorithms in order to increase the quality of simulations of the propagation and interaction of nonlinear gasdynamic and magnetogasdynamic waves. 2) Development of computational codes to simulate high-enthalpy gasdynamic flows, including chemical changes and dispersive and diffusive effects. 3) Development of computational codes to simulate ideal and real magnetogasdynamic flows. 4) Application of the new algorithms and computational codes to the solution of the aerothermodynamic flow around bodies entering the Earth's atmosphere. 5) Application of the new algorithms and computational codes to the simulation of the nonlinear dynamic behaviour of magnetic arcs in the solar corona. 6) Development of new models to describe the nonlinear behaviour of magnetic arcs in the solar corona. The main objective of this project is to introduce improvements in numerical algorithms for simulating the propagation and interaction of nonlinear waves in two gaseous media: those without free electric charge (gasdynamic flows) and those with free electric charge (magnetogasdynamic flows). At the same time, computational codes implementing the improved numerical techniques will be developed. The numerical algorithms will be applied in order to advance knowledge on topics of interest in aerospace engineering, such as the computation of the heat flux and aerothermodynamic forces experienced by objects entering the Earth's atmosphere, and on topics in astrophysics, such as wave propagation and interaction, both for energy transfer and for the generation of instabilities in magnetic arcs of the solar corona. These two topics share the numerical techniques and algorithms with which they will be treated. The ideal gasdynamic and magnetogasdynamic equations form hyperbolic systems of differential equations and can be solved using Riemann solvers together with the finite volume method (Toro 1999; Udrea 1999; LeVeque 1992 and 2005). The inclusion of diffusive effects makes the systems of equations hyperbolic-parabolic. The parabolic contribution can be treated as source terms, handled either explicitly or implicitly (Udrea 1999; LeVeque 2005). To analyse the flow around bodies entering the atmosphere, the chemically reacting Navier-Stokes equations will be used, as long as the temperature does not exceed 6000 K; for higher temperatures, ionization effects must be considered (Anderson, 1989). Both diffusive effects and chemical changes will be treated as source terms in the Euler equations. To treat wave propagation, energy transfer and instabilities in magnetic arcs of the solar corona, the ideal and real magnetogasdynamic equations will be used. In this case it will also be convenient to implement source terms for transport phenomena such as heat flux and radiation.
The codes will use the finite volume technique, together with Total Variation Diminishing (TVD) schemes on structured and unstructured meshes.
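To make the numerical setting concrete, here is a minimal 1D finite-volume update with a TVD (minmod-limited) reconstruction for linear advection on a periodic mesh. This is a generic textbook sketch for illustration, not the project's gasdynamic or magnetogasdynamic code; a production solver would use a Riemann solver for the full equation system and a higher-order time integrator.

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: keeps the reconstruction total-variation diminishing."""
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect_tvd(u, dx, a=1.0, cfl=0.5, steps=100):
    """Finite-volume update for u_t + a*u_x = 0 (a > 0) on a periodic 1D mesh,
    with minmod-limited piecewise-linear reconstruction and forward-Euler time
    stepping (kept simple on purpose)."""
    dt = cfl * dx / abs(a)
    for _ in range(steps):
        slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)   # limited cell slopes
        flux = a * (u + 0.5 * slope)          # upwind flux at each cell's right face
        u = u - dt / dx * (flux - np.roll(flux, 1))
    return u

# Example: advect a square pulse; the limiter suppresses spurious oscillations.
x = np.linspace(0.0, 100.0, 200, endpoint=False)
u0 = np.where((x > 20.0) & (x < 40.0), 1.0, 0.0)
u_final = advect_tvd(u0, dx=x[1] - x[0])
print(u_final.min(), u_final.max())   # stays within the initial bounds [0, 1]
```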
Abstract:
In our previous project we approximated the computation of a definite integral with integrands exhibiting large functional variations. Our approach parallelizes the algorithm of an adaptive quadrature method based on Newton-Cotes rules. The first results obtained were communicated at various national and international conferences; they allowed us to begin a typification of the existing quadrature rules and a classification of some functions used as test functions. These classification and typification tasks have not been finished, so we intend to continue them in order to be able to report on whether or not it is advisable to use our technique. To carry out this task, a base of test functions will be sought and the range of quadrature rules used will be widened. In addition, we propose to restructure the computation of some routines involved in computing the minimum energy of a molecule. This program already exists in its sequential version and is modelled using the LCAO approximation. It obtains successful results in terms of accuracy, compared with other similar international publications, but requires a significantly long computation time. Our proposal is to parallelize the aforementioned algorithm at at least two levels: 1) decide whether it is advisable to distribute the computation of a single integral among several processors, or whether it is better to distribute different integrals among different processors; we must bear in mind that in parallel architectures based on networks (typically local area networks, LANs) the time taken to send messages between processors is very significant when measured in the number of operations a processor can complete in the meantime; 2) if necessary, parallelize the computation of double and/or triple integrals. For the development of our proposal, heuristics will be devised to verify and build models in the aforementioned cases, aimed at improving the known computation routines, and the algorithms will be tested against test cases. The methodology is the usual one in numerical computing. Each proposal requires: a) implementing a computation algorithm, trying to achieve versions that improve on the existing ones; b) carrying out comparison exercises against the existing routines to confirm or discard a better numerical performance; c) carrying out theoretical error studies linked to the method and to the implementation. An interdisciplinary team was formed, composed of researchers from both Computer Science and Mathematics. Goals: we expect to obtain a characterization of quadrature rules according to their effectiveness for functions with oscillatory behaviour and with exponential decay, and to develop suitable, optimized computational implementations based on parallel architectures.
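The project does not publish its code; purely as an illustration, the serial building block (recursive adaptive Simpson quadrature, a Newton-Cotes-based rule) and the coarse-grained distribution of independent integrals across worker processes might be sketched as follows. The tolerances, the process-pool choice, and the test integrand are assumptions, not the project's actual routines.

```python
import math
from concurrent.futures import ProcessPoolExecutor

def adaptive_simpson(f, a, b, tol=1e-8):
    """Recursive adaptive quadrature based on Simpson's rule (a Newton-Cotes rule):
    intervals are subdivided wherever the local error estimate exceeds the tolerance."""
    def simpson(lo, hi):
        mid = 0.5 * (lo + hi)
        return (hi - lo) / 6.0 * (f(lo) + 4.0 * f(mid) + f(hi))

    def recurse(lo, hi, whole, tol):
        mid = 0.5 * (lo + hi)
        left, right = simpson(lo, mid), simpson(mid, hi)
        err = left + right - whole
        if abs(err) < 15.0 * tol:                     # standard Richardson-type estimate
            return left + right + err / 15.0
        return recurse(lo, mid, left, tol / 2.0) + recurse(mid, hi, right, tol / 2.0)

    return recurse(a, b, simpson(a, b), tol)

def _worker(task):
    f, a, b = task
    return adaptive_simpson(f, a, b)

def integrate_many(tasks):
    """Coarse-grained parallelism: distribute whole integrals across processes
    (one of the two distribution levels the project discusses)."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(_worker, tasks))

def oscillatory(x):
    """A test integrand with large functional variation."""
    return math.sin(50.0 * x) * math.exp(-x)

if __name__ == "__main__":
    print(integrate_many([(oscillatory, 0.0, 1.0), (math.exp, 0.0, 1.0)]))
```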
Abstract:
As digital image processing techniques become increasingly used in a broad range of consumer applications, the critical need to evaluate algorithm performance has become recognised by developers as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is of crucial importance that we are able to empirically study their performance. Apart from the field of biometrics, little emphasis has been put on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
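The paper describes a framework rather than code; purely as an illustration of the kind of automation it argues for (the interfaces, metric, and test cases below are assumptions, not the paper's framework), a minimal batch-evaluation harness could be structured as:

```python
from dataclasses import dataclass
from statistics import mean
from typing import Callable, Sequence

@dataclass
class TestCase:
    name: str
    image: object          # input image (placeholder type)
    ground_truth: object   # expected output / annotation

def evaluate(algorithm: Callable, metric: Callable, cases: Sequence[TestCase]):
    """Run an image-processing algorithm over a labelled test set and report
    per-case and aggregate scores, so performance regressions surface early."""
    scores = {c.name: metric(algorithm(c.image), c.ground_truth) for c in cases}
    return scores, mean(scores.values())

# Example with trivial stand-ins: an 'algorithm' that thresholds a grey level and
# a metric that checks agreement with the expected label.
cases = [TestCase("dark", image=30, ground_truth=0),
         TestCase("bright", image=200, ground_truth=1)]
binarize = lambda px: int(px > 128)
agreement = lambda out, truth: 1.0 if out == truth else 0.0
per_case, overall = evaluate(binarize, agreement, cases)
print(per_case, overall)
```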