877 results for Multi-objective evolutionary algorithm
Abstract:
Doctoral thesis in Industrial and Systems Engineering.
Abstract:
The main objective of this project is the development of new multi-domain models of electric machines for control and fault diagnosis applications. The electromagnetic model of the induction motor (IM) will be built using the magnetic equivalent circuit (MEC) approach and validated by simulation and experimental results. As a second step, new mechanical and thermal models of the IM will be developed, with the objective of coupling them to the electromagnetic model; this multi-domain model will make it possible to study the interactions between the domains, and the complete model will again be validated by simulation and experiment. Finally, the model will be used as a tool for testing new control and fault diagnosis strategies.
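As a rough illustration of the magnetic equivalent circuit idea named above, the sketch below solves a toy reluctance network the same way one solves a resistive circuit, with MMF in the role of voltage and flux in the role of current. The topology and all numerical values are assumptions for illustration, not taken from the project.

```python
# Toy magnetic equivalent circuit (MEC): a coil MMF drives flux through a
# core reluctance, after which the flux splits between an air-gap path and
# a leakage path in parallel. All values are illustrative assumptions.
mmf = 200.0                             # magnetomotive force N*i [A-turns]
R_core, R_gap, R_leak = 1e5, 8e5, 4e6   # reluctances [A-turns/Wb]

# Node equation for the magnetic potential u at the splitting node:
# (mmf - u) / R_core = u / R_gap + u / R_leak
u = (mmf / R_core) / (1 / R_core + 1 / R_gap + 1 / R_leak)
phi_gap, phi_leak = u / R_gap, u / R_leak
print(f"air-gap flux: {phi_gap:.2e} Wb, leakage flux: {phi_leak:.2e} Wb")
```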
Abstract:
Today's advances in high-performance computing are driven by the parallel processing capabilities of available hardware architectures. These architectures enable the acceleration of algorithms when the algorithms are properly parallelized and exploit the specific processing power of the underlying architecture. Most current processors target general-purpose workloads and integrate several processor cores on a single chip, resulting in what is known as a Symmetric Multiprocessing (SMP) unit; nowadays even desktop computers use multicore processors, and the industry trend is to increase the number of integrated cores as the technology matures. Graphics Processor Units (GPU), originally designed to handle only video processing, have emerged as interesting alternatives for algorithm acceleration, and currently available GPUs can run 200 to 400 parallel processing threads. Scientific computing can be implemented on this hardware thanks to the programmability of the new GPUs, which have come to be called General Processing Graphics Processor Units (GPGPU). However, GPGPUs offer little memory compared with general-purpose processors, so algorithm implementations must be designed carefully. Finally, Field Programmable Gate Arrays (FPGA) are programmable devices that can implement hardware logic with low latency, high parallelism and deep pipelines; they can be used to implement specific algorithms that need to run at very high speed, but programming them is harder than software approaches and debugging is typically time-consuming. In this context, where several alternatives for speeding up algorithms are available, our work aims at determining the main features of these architectures, developing the know-how required to accelerate algorithm execution on them, identifying which algorithms fit best on a given architecture, and combining architectures so that they complement one another. Specifically, starting from the characteristics of the hardware, we determine the properties a parallel algorithm must have in order to be accelerated; these properties in turn determine which of the new hardware types is best suited for its implementation. In particular, we take into account the degree of data dependence, the need for synchronization during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
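The criteria listed at the end of the abstract (data dependence, synchronization, data size) can be made concrete with a minimal SMP example: a kernel with no data dependence and no synchronization parallelizes trivially across cores, which is the baseline case against which GPU and FPGA mappings are compared. The workload below is an invented stand-in, using only Python's standard multiprocessing module.

```python
import math
import time
import multiprocessing as mp

def work(x):
    # Embarrassingly parallel kernel: no data dependence across items,
    # no synchronization needed, so it scales well on SMP cores.
    return sum(math.sin(x + i) for i in range(200_000))

if __name__ == "__main__":
    xs = list(range(32))
    t0 = time.perf_counter()
    seq = [work(x) for x in xs]             # sequential baseline
    t1 = time.perf_counter()
    with mp.Pool() as pool:                 # one worker per available core
        par = pool.map(work, xs)
    t2 = time.perf_counter()
    print(f"sequential: {t1 - t0:.2f}s, parallel: {t2 - t1:.2f}s")
```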
Abstract:
Background: Vascular remodeling, the dynamic dimensional change in the face of stress, can assume different directions as well as magnitudes in atherosclerotic disease. Classical measurements rely on reference segments at a distance, risking inappropriate comparison between dissimilar vessel portions. Objective: To explore a new method for quantifying vessel remodeling, based on the comparison between a given target segment and its inferred normal dimensions. Methods: Geometric parameters and plaque composition were determined in 67 patients using three-vessel intravascular ultrasound with virtual histology (IVUS-VH). Coronary vessel remodeling at the cross-section (n = 27,639) and lesion (n = 618) levels was assessed using classical metrics and a novel analytic algorithm based on the fractional vessel remodeling index (FVRI), which quantifies the total change in arterial wall dimensions relative to the estimated normal dimension of the vessel. A prediction model was built to estimate the normal dimension of the vessel for calculation of FVRI. Results: According to the new algorithm, the "ectatic" remodeling pattern was least common, "complete compensatory" remodeling was present in approximately half of the instances, and "negative" and "incomplete compensatory" remodeling accounted for the remainder. Compared to a traditional diagnostic scheme, the FVRI-based classification appeared to better discriminate plaque composition by IVUS-VH. Conclusion: Quantitative assessment of coronary remodeling using target segment dimensions offers a promising approach to evaluating the vessel response to plaque growth or regression.
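Reading the abstract literally, FVRI compares a measured wall dimension with the vessel's inferred normal dimension; the sketch below encodes that reading. The published formula and the thresholds for the four remodeling categories may differ from this guess.

```python
def fvri(eem_area, eem_area_normal):
    """Fractional vessel remodeling index, as described in the abstract:
    total change in arterial wall dimension relative to the estimated
    normal dimension of the vessel (positive = outward remodeling)."""
    return (eem_area - eem_area_normal) / eem_area_normal

# Example: a cross-section of 16.2 mm^2 against an inferred normal of
# 14.0 mm^2 gives an FVRI of about 0.157, i.e. ~15.7% outward remodeling.
print(f"FVRI = {fvri(16.2, 14.0):.3f}")
```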
Abstract:
The main argument developed here is the proposal of the concept of "Social Multi-Criteria Evaluation" (SMCE) as a possibly useful framework for applying social choice to the difficult policy problems of our millennium, where, as stated by Funtowicz and Ravetz, "facts are uncertain, values in dispute, stakes high and decisions urgent". This paper starts from two main questions: 1. Why "Social" Multi-Criteria Evaluation? 2. How should such an approach be developed? The foundations of SMCE are set up by referring to concepts from complex systems theory and philosophy, such as reflexive complexity, post-normal science and incommensurability. To give operational guidelines on the application of SMCE, two basic questions must be answered: 1. How is it possible to deal with technical incommensurability? 2. How can we deal with the issue of social incommensurability? Answering these questions, using theoretical considerations and lessons learned from real-world case studies, is the main objective of the present article.
Abstract:
The objective of this work was to develop an easily applicable technique and a standardized protocol for high-quality post-mortem angiography. The protocol should (1) improve radiological interpretation by decreasing perfusion-related artifacts and achieving complete filling of the vascular system and (2) ease and standardize the execution of the examination. To this aim, 45 human corpses were investigated by post-mortem computed tomography (CT) angiography using different perfusion protocols, a modified heart-lung machine and a new contrast agent mixture developed specifically for post-mortem investigations. The quality of the CT angiographies was evaluated radiologically by observing the filling of the vascular system, assessing the interpretability of the resulting images and comparing radiological diagnoses to conventional autopsy conclusions. Post-mortem angiography yielded satisfactory results provided that the volume of injected contrast agent mixture was high enough to completely fill the vascular system. To avoid artifacts due to the post-mortem perfusion, a minimum of three angiographic phases and one native scan had to be performed. These findings were taken into account to develop a protocol for high-quality post-mortem CT angiography that minimizes the risk of radiological misinterpretation. The proposed protocol is easily applicable in a standardized way and yields high-quality, radiologically interpretable visualization of the vascular system in post-mortem investigations.
Abstract:
The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability instead of concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm that allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data and can therefore efficiently model the local anomalies that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which is a possible limitation for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs137 activity, given measurements taken in the region of Briansk following the Chernobyl accident.
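The core construction, an SVR whose kernel mixes two spatial scales, can be sketched with scikit-learn, which accepts a callable kernel. The bandwidths and mixture weight below are fixed placeholders; the paper learns the optimal mixture from the data, which this sketch does not attempt.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

def multi_scale_kernel(X, Y, gamma_short=50.0, gamma_large=0.1, w=0.5):
    # Convex combination of a short-scale and a large-scale RBF kernel;
    # a sum of valid kernels is itself a valid kernel.
    return (w * rbf_kernel(X, Y, gamma=gamma_short)
            + (1 - w) * rbf_kernel(X, Y, gamma=gamma_large))

# Toy 2-D spatial data: a smooth large-scale trend plus a local anomaly.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = (np.sin(2 * np.pi * X[:, 0])
     + 0.5 * np.exp(-((X - 0.5) ** 2).sum(axis=1) / 0.01))

model = SVR(kernel=multi_scale_kernel, C=10.0).fit(X, y)
print("training R^2:", model.score(X, y))
```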
Abstract:
Introduction: Coordination is a strategy chosen by the central nervous system to control movement and maintain stability during gait. Coordinated multi-joint movements require a complex interaction between nervous outputs, biomechanical constraints, and proprioception. Quantitatively understanding and modeling gait coordination remains a challenge, and surgeons lack a way to model and appreciate the coordination of patients before and after surgery of the lower limbs. Patients alter their gait patterns and their kinematic synergies when they walk faster or slower than normal speed, to maintain their stability and minimize the energy cost of locomotion. The goal of this study was to provide a dynamical-systems approach to quantitatively describe human gait coordination and to apply it to patients before and after total knee arthroplasty. Methods: A new method of quantitative analysis of interjoint coordination during gait was designed, providing a general model that captures the whole dynamics and shows the kinematic synergies at various walking speeds. The proposed model imposes a relationship among the lower-limb joint angles (hips and knees) to parameterize the dynamics of locomotion of each individual. An integration of different analysis tools, namely harmonic analysis, principal component analysis and an artificial neural network, helped overcome the high dimensionality, temporal dependence, and non-linear relationships of the gait patterns. Ten patients were studied using an ambulatory gait device (Physilog®). Each participant was asked to perform two 30 m walking trials at 3 different speeds and to complete an EQ-5D questionnaire, a WOMAC and a Knee Society Score. Lower-limb rotations were measured by four miniature angular rate sensors mounted on each shank and thigh. The outcomes of the eight patients undergoing total knee arthroplasty, recorded pre-operatively and post-operatively at 6 weeks, 3 months, 6 months and 1 year, were compared to those of 2 age-matched healthy subjects. Results: The new method provided coordination scores at various walking speeds, ranging from 0 to 10. It determined the overall coordination of the lower limbs as well as the contribution of each joint to the total coordination. The differences between pre-operative and post-operative coordination values correlated with improvements in the subjective outcome scores. Although the study group was small, the results show a new way to objectively quantify the gait coordination of patients undergoing total knee arthroplasty using only portable body-fixed sensors. Conclusion: A new method for objective gait coordination analysis has been developed, with very encouraging results regarding the objective outcome of lower-limb surgery.
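Of the three analysis tools the abstract combines, the PCA step lends itself to a compact sketch: the share of joint-angle variance explained by a few kinematic synergies can be scaled into a 0-10 coordination score. The scoring rule below captures the spirit of the method only; it is not the authors' actual formula, and the harmonic-analysis and neural-network stages are omitted.

```python
import numpy as np
from sklearn.decomposition import PCA

def coordination_score(angles, n_synergies=2):
    """angles: (n_samples, 4) array of hip and knee angles sampled over
    gait cycles. Returns a 0-10 score that is high when a few kinematic
    synergies capture most of the joint-angle variance."""
    pca = PCA().fit(angles)
    return 10.0 * pca.explained_variance_ratio_[:n_synergies].sum()

# Toy trajectories: strongly coupled joints -> score close to 10.
t = np.linspace(0, 2 * np.pi, 500)
angles = np.column_stack([np.sin(t), np.sin(t + 0.2),
                          np.sin(t + 3.1), np.sin(t + 3.3)])
angles += np.random.default_rng(1).normal(0, 0.05, angles.shape)
print(f"coordination score: {coordination_score(angles):.1f}")
```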
Abstract:
The nematode parasite Ascaris lumbricoides infects the digestive tracts of over 1.4 billion people worldwide, and its sister species, Ascaris suum, has infected a countless number of domesticated and feral pigs. It is generally thought that the putative ancestor to these worms infected either humans or pigs, but with the advent of domestication, they had ample opportunity to jump to a new host and subsequently specialize and evolve into a new species. While nuclear DNA markers decisively separate the two populations, mitochondrial sequences reveal that three major haplotypes are found in A. suum and in A. lumbricoides, indicating either occasional hybridization, causing introgression of gene trees, or retention of polymorphism dating back to the original ancestral species. This article provides an illustration of the combined contribution of parasitology, archaeoparasitology, genetics and paleogenetics to the history of ascariasis. We specifically investigate the molecular history of ascariasis in humans by sequencing DNA from the eggs of Ascaris found among ancient archeological remains. The findings of this paleogenetic survey will explain whether the three mitochondrial haplotypes result from recent hybridization and introgression, due to intensive human-pig interaction, or whether their co-occurrence predates pig husbandry, perhaps dating back to the common ancestor. We hope to show how human-pig interaction has shaped the recent evolutionary history of this disease, perhaps revealing the identity of the ancestral host.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure, and then relate the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying "true" hydraulic conductivity field. We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
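The primitive underlying Bayesian sequential simulation is the sequential Gaussian draw: visit locations in random order, compute a kriging mean and variance from the data plus previously simulated values, and sample from the resulting conditional Gaussian. A minimal 1-D version with simple kriging and an assumed exponential covariance is sketched below; the paper's two-step, multi-parameter scheme builds on this primitive rather than being reproduced here.

```python
import numpy as np

def sgs_1d(xs, data_x, data_v, corr_len=5.0, seed=0):
    """Minimal sequential Gaussian simulation along a 1-D grid, using
    simple kriging with zero mean, unit sill and an assumed exponential
    covariance model (illustrative parameters only)."""
    rng = np.random.default_rng(seed)
    cov = lambda h: np.exp(-np.abs(h) / corr_len)
    known_x, known_v = list(data_x), list(data_v)
    out = np.empty(len(xs))
    for i in rng.permutation(len(xs)):          # random visiting order
        kx, kv = np.array(known_x), np.array(known_v)
        C = cov(kx[:, None] - kx[None, :]) + 1e-9 * np.eye(len(kx))
        c = cov(kx - xs[i])
        w = np.linalg.solve(C, c)               # simple-kriging weights
        mu, var = w @ kv, max(1.0 - w @ c, 1e-9)
        out[i] = rng.normal(mu, np.sqrt(var))   # conditional Gaussian draw
        known_x.append(xs[i]); known_v.append(out[i])
    return out

# Two conditioning points; each realization honours them at the grid nodes.
field = sgs_1d(np.linspace(0, 20, 41), data_x=[2.0, 15.0], data_v=[1.2, -0.8])
```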
Abstract:
OBJECTIVE: Mild neurocognitive disorders (MND) affect a subset of HIV+ patients under effective combination antiretroviral therapy (cART). In this study, we used an innovative multi-contrast magnetic resonance imaging (MRI) approach at high field to assess the presence of micro-structural brain alterations in MND+ patients. METHODS: We enrolled 17 MND+ and 19 MND- patients with undetectable HIV-1 RNA and 19 healthy controls (HC). MRI acquisitions at 3T included MP2RAGE for T1 relaxation times, and Magnetization Transfer (MT), T2* and Susceptibility Weighted Imaging (SWI) to probe micro-structural integrity and iron deposition in the brain. Statistical analysis used permutation-based tests with correction for the family-wise error rate. Multiple regression analysis was performed between the MRI data and (i) neuropsychological results and (ii) HIV infection characteristics. A linear discriminant analysis (LDA) based on the MRI data was performed between MND+ and MND- patients and cross-validated with a leave-one-out test. RESULTS: Our data revealed loss of structural integrity and micro-oedema in MND+ patients compared to HC in the global white and cortical gray matter, as well as in the thalamus and basal ganglia. Multiple regression analysis showed a significant influence of sub-cortical nuclei alterations on the executive index of MND+ patients (p = 0.04, R² = 95.2). The LDA distinguished MND+ and MND- patients with a classification quality of 73% after cross-validation. CONCLUSION: Our study shows micro-structural brain tissue alterations in MND+ patients under effective therapy and suggests that multi-contrast MRI at high field is a powerful approach to discriminate between HIV+ patients on cART with and without mild neurocognitive deficits.
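The classification step the abstract describes, LDA cross-validated with a leave-one-out test, is straightforward to reproduce with scikit-learn. The feature matrix below is a random placeholder standing in for the per-patient MRI metrics; only the shape (17 MND+ vs. 19 MND- patients) follows the abstract.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Placeholder data: one row per patient, columns = MRI-derived features
# (e.g. T1, MT, T2*, SWI summaries); the real values are not available here.
rng = np.random.default_rng(0)
X = rng.normal(size=(36, 4))            # 17 MND+ and 19 MND- patients
y = np.array([1] * 17 + [0] * 19)

acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {acc.mean():.2f}")
# The study reports 73% with the real MRI features; random placeholders
# will of course land near chance level.
```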
Abstract:
In this paper, different recovery methods applied at different network layers and time scales are used to enhance network reliability. Each layer deploys its own fault management methods, but current recovery methods apply to only one specific layer. New protection schemes, based on the proposed partial disjoint path algorithm, are defined in order to avoid protection duplication in a multi-layer scenario. The new protection schemes also encompass shared segment backup computation and shared risk link group identification. A complete set of experiments proves the efficiency of the proposed methods relative to previous ones in terms of the resources used to protect the network, the failure recovery time and the request rejection ratio.
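The abstract does not give the partial disjoint path algorithm itself, but its intent can be sketched: compute a backup path that avoids the primary path's links where the topology allows, by penalizing rather than forbidding shared edges. The penalty scheme below is an assumption standing in for the paper's actual algorithm.

```python
import networkx as nx

def partial_disjoint_backup(G, src, dst, penalty=100.0):
    """Sketch of a partial disjoint backup path: shared edges are heavily
    penalized, so the backup reuses a primary link only where no
    alternative exists (hence 'partial' disjointness)."""
    primary = nx.shortest_path(G, src, dst, weight="weight")
    on_primary = set(zip(primary, primary[1:])) | set(zip(primary[1:], primary))

    def w(u, v, d):
        base = d.get("weight", 1.0)
        return base * penalty if (u, v) in on_primary else base

    backup = nx.shortest_path(G, src, dst, weight=w)
    return primary, backup

G = nx.Graph()
G.add_weighted_edges_from([("a", "b", 1), ("b", "c", 1),
                           ("a", "d", 2), ("d", "c", 2)])
print(partial_disjoint_backup(G, "a", "c"))   # (['a', 'b', 'c'], ['a', 'd', 'c'])
```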
Abstract:
HEMOLIA (a project under the European Community's 7th Framework Programme) is a new-generation Anti-Money Laundering (AML) intelligent multi-agent alert and investigation system which, in addition to traditional financial data, makes extensive use of modern society's huge telecom data source, thereby opening up a new dimension of capabilities to all money laundering fighters (FIUs, LEAs) and financial institutes (banks, insurance companies, etc.). This Master's thesis project was done at AIA, one of the partners of the HEMOLIA project in Barcelona. The objective of this thesis is to find the clusters in a network drawn from the financial data. An extensive literature survey has been carried out and several standard network algorithms have been studied and implemented. The clustering problem is NP-hard, and algorithms like K-Means and hierarchical clustering are widely used to study problems in sociology, evolution, anthropology, etc.; however, they have certain drawbacks that make them difficult to apply in practice. The thesis proposes (a) a possible improvement to the K-Means algorithm, (b) a novel approach to the clustering problem using genetic algorithms and (c) a new algorithm for finding the cluster of a node using a genetic algorithm.
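The abstract describes the GA-based clustering only at a high level. A toy version, with individuals encoding k centroids, fitness as the within-cluster sum of squares, truncation selection, uniform crossover and Gaussian mutation, illustrates the general idea; all operators and rates here are assumptions, not the thesis's actual design.

```python
import numpy as np

def ga_cluster(X, k=3, pop=30, gens=100, seed=0):
    """Toy genetic algorithm for clustering: each individual encodes k
    centroids; fitness is the (negated) within-cluster sum of squares."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    P = X[rng.integers(0, n, size=(pop, k))]     # init centroids from data

    def sse(C):
        dist = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        return dist.min(1).sum()

    for _ in range(gens):
        fit = np.array([sse(C) for C in P])
        elite = P[np.argsort(fit)[: pop // 2]]   # truncation selection
        kids = elite[rng.permutation(len(elite))].copy()
        mask = rng.random((len(kids), k)) < 0.5  # uniform crossover
        kids[mask] = elite[mask]
        # Gaussian mutation on ~20% of centroid coordinates
        kids += rng.normal(0, 0.05, kids.shape) * (rng.random(kids.shape) < 0.2)
        P = np.concatenate([elite, kids])

    best = P[np.argmin([sse(C) for C in P])]
    labels = ((X[:, None, :] - best[None, :, :]) ** 2).sum(-1).argmin(1)
    return best, labels

rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(m, 0.3, (50, 2)) for m in (0.0, 3.0, 6.0)])
centroids, labels = ga_cluster(X)
print(centroids)
```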
Abstract:
BACKGROUND: The objective of this research was to evaluate data from a randomized clinical trial that tested injectable diacetylmorphine (DAM) and oral methadone (MMT) for substitution treatment, using a multi-domain dichotomous index with a Bayesian approach. METHODS: Sixty-two long-term, socially excluded heroin injectors not benefiting from available treatments were randomized to receive either DAM or MMT for 9 months in Granada, Spain. Forty-four participants completed the trial, and end-of-study data were obtained for 50. Participants were classified as responders or non-responders using a multi-domain outcome index accounting for their physical and mental health and psychosocial integration, as used in a previous trial. Data were analyzed with Bayesian methods, using information from a similar study conducted in The Netherlands to select a priori distributions. On adding the data from the present study to update the a priori information, the distribution of the difference in response rates was obtained and used to build credibility intervals and perform the relevant probability computations. RESULTS: In the experimental group (n = 27), the rate of responders to treatment was 70.4% (95% CI 53.2-87.6), and in the control group (n = 23) it was 34.8% (95% CI 15.3-54.3). The probability of success in the experimental group using the a posteriori distributions remained higher after a proper sensitivity analysis. Almost the whole distribution of the rate difference (diacetylmorphine minus methadone) was located to the right of zero, indicating the superiority of the experimental treatment. CONCLUSION: The present analysis suggests a clinical superiority of injectable diacetylmorphine over oral methadone in the treatment of severely affected heroin injectors not benefiting sufficiently from the available treatments. TRIAL REGISTRATION: Current Controlled Trials ISRCTN52023186.
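The Bayesian analysis the abstract outlines reduces to Beta-Binomial updating plus Monte Carlo sampling of the rate difference. The responder counts below follow the reported percentages (19/27 and 8/23); the prior pseudo-counts are invented stand-ins for the Dutch-trial information.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical Beta priors encoding the Dutch trial (illustrative only)
prior_dam = (12, 8)          # assumed prior successes/failures, DAM arm
prior_mmt = (7, 13)          # assumed prior, MMT arm

# Granada trial data, back-calculated from the reported response rates
dam_resp, dam_n = 19, 27     # 70.4% of 27
mmt_resp, mmt_n = 8, 23      # 34.8% of 23

# Beta posterior = Beta(prior_a + successes, prior_b + failures)
post_dam = rng.beta(prior_dam[0] + dam_resp, prior_dam[1] + dam_n - dam_resp, 100_000)
post_mmt = rng.beta(prior_mmt[0] + mmt_resp, prior_mmt[1] + mmt_n - mmt_resp, 100_000)

diff = post_dam - post_mmt   # rate difference, DAM minus MMT
print(f"P(DAM better than MMT) = {(diff > 0).mean():.3f}")
print("95% credibility interval:", np.percentile(diff, [2.5, 97.5]))
```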
Abstract:
Social businesses present a new paradigm to capitalism, in which private companies, non-profit organizations and civil society create a new type of business with the main objective of solving social problems with financial sustainability and efficiency through market mechanisms. As with any new phenomenon, different authors conceptualize social businesses from distinct viewpoints. This article aims to present and characterize three different perspectives on social business definitions: the European, the American and that of the emerging countries. Each of these views is illustrated by a different Brazilian case. We conclude that all the cases share similar characteristics, but also show relevant differences that are more than merely geographical. The perspectives analyzed in this paper provide an analytical framework for understanding the field of social businesses. Moreover, the cases demonstrate that in the Brazilian context the field of social business is under construction and, as such, draws on different conceptual influences to deal with a complex and challenging reality.