855 results for Asynchronous iterative algorithms
Abstract:
The main objective of ventilation systems in case of fire is to reduce the possible consequences by achieving the best possible conditions for the evacuation of the users and the intervention of the emergency services. The required immediate transition from normal to emergency operation of the ventilation equipment is reinforced by the use of automatic and semi-automatic control systems, which reduce response times by assisting the operators and applying pre-defined strategies. A further step consists in the use of closed-loop algorithms, which take into account not only the initial conditions but also their evolution (air velocity, traffic situation, etc.), optimizing the smoke control capacity.
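As a minimal sketch of the kind of closed-loop logic the abstract alludes to, the following assumes a hypothetical jet-fan installation whose measured longitudinal air velocity is driven towards a critical-velocity setpoint; it is not the authors' control strategy.

```python
# Minimal sketch of a closed-loop smoke-control idea (not the authors' algorithm):
# a PI controller drives the measured longitudinal air velocity in a tunnel
# towards a hypothetical critical-velocity setpoint by adjusting jet-fan thrust.

def pi_fan_controller(setpoint, measure, state, kp=0.8, ki=0.1, dt=1.0):
    """Return a fan command in [0, 1] and the updated integral state."""
    error = setpoint - measure
    integral = state + error * dt
    command = kp * error + ki * integral
    command = max(0.0, min(1.0, command))   # saturate the actuator
    return command, integral

# Example: velocity below the 3 m/s critical value -> controller raises fan demand.
cmd, integ = pi_fan_controller(setpoint=3.0, measure=1.8, state=0.0)
print(cmd, integ)
```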
Abstract:
This paper describes the design and evaluation of a new platform created to improve the learning experience of bilateral control algorithms in teleoperation. This experimental platform, developed at Universidad Politécnica de Madrid, is used by the students of the Master in Automation and Robotics in the laboratory sessions of the subject "Telerobotics and Teleoperation". The main objective is to easily implement different control architectures on the developed platform and evaluate them under different conditions, in order to better understand the main advantages and drawbacks of each control scheme. The students' tasks therefore focus on adjusting the control parameters of the predefined controllers and on designing new ones to analyze the changes in the behavior of the whole system. A description of the subject, its main topics and the platform that was built are detailed in the paper. Furthermore, the methodology followed in the laboratory sessions and the bilateral control algorithms are presented. Finally, the results obtained in the experiments with students are also shown.
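As one example of the kind of control scheme students could compare on such a platform, the following is a minimal sketch of a position-position bilateral law; the gains and the device interface are hypothetical and are not taken from the paper.

```python
# Minimal sketch of a position-position bilateral control law (one possible
# architecture a student could test on such a platform; gains are hypothetical).
# Each side is driven by the position error with respect to the other side,
# so the operator feels a force proportional to the master/slave mismatch.

def position_position_step(xm, xs, kp_m=50.0, kp_s=200.0, kd=1.0, vm=0.0, vs=0.0):
    """Return the force commands applied to the master and slave devices."""
    f_master = -kp_m * (xm - xs) - kd * vm   # reflected force felt by the operator
    f_slave = kp_s * (xm - xs) - kd * vs     # slave tracks the master position
    return f_master, f_slave

print(position_position_step(xm=0.10, xs=0.08))
```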
Abstract:
The analysis of large amounts of data is a field with many years of research behind it. It is centred on extracting significant values in order to make data easier to understand and interpret. The analysis of interdependence between time series is an important line of research, mainly as a result of advances in the characterization of dynamical systems from the signals they produce. In medicine, many studies try to understand the behaviour of the brain, its mode of operation and its internal connections. The human brain comprises approximately 10^11 neurons, each of which makes about 10^3 synaptic connections. This huge number of connections between individual processing elements provides the fundamental substrate for neuronal ensembles to become transiently synchronized or functionally connected. A similar complex network configuration and dynamics can also be found at the macroscopic scales of systems neuroscience and brain imaging. The emergence of dynamically coupled cell assemblies represents the neurophysiological substrate for cognitive functions such as perception, learning and thinking.
Understanding the complex network organization of the brain on the basis of neuroimaging data represents one of the greatest challenges for systems neuroscience. Brain connectivity is an elusive concept that refers to different interrelated aspects of brain organization: structural connectivity, functional connectivity (FC) and effective connectivity (EC). Structural connectivity refers to a network of physical connections linking sets of neurons; it is the anatomical structure of brain networks. FC, in contrast, refers to the statistical dependence between the signals stemming from two distinct units within a nervous system, while EC refers to the causal interactions between them. This research opens the door to addressing diseases related to the brain, such as Parkinson's disease, senile dementia and mild cognitive impairment. One of the most important projects associated with research on Alzheimer's and other diseases is the European project called Blue Brain. The Center for Biomedical Technology (CTB) of Universidad Politécnica de Madrid (UPM) forms part of the project. The CTB researchers have developed a magnetoencephalography (MEG) data processing tool that allows data to be visualised and analysed in an intuitive way. This tool is named HERMES, and it is presented in this document.
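As a minimal illustration of functional connectivity understood as statistical dependence between two signals (a generic estimator, not the HERMES toolbox code), FC between two time series can be approximated by their Pearson correlation:

```python
import numpy as np

# Minimal illustration of functional connectivity as statistical dependence
# between two signals (a generic estimator, not the HERMES toolbox code).
rng = np.random.default_rng(0)
common = rng.standard_normal(1000)            # shared driving activity
x = common + 0.5 * rng.standard_normal(1000)  # sensor/source 1
y = common + 0.5 * rng.standard_normal(1000)  # sensor/source 2

fc = np.corrcoef(x, y)[0, 1]                  # linear FC estimate in [-1, 1]
print(f"Pearson-correlation FC estimate: {fc:.2f}")
```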
Abstract:
The design of a nuclear power plant has to follow a number of regulations aimed at limiting the risks inherent in this type of installation. The goal is to prevent, and to limit the consequences of, any possible incident that might threaten the public or the environment. To verify that the safety requirements are met, a safety assessment process is followed. Safety analysis is a key component of a safety assessment, and it incorporates both probabilistic and deterministic approaches. The deterministic approach attempts to ensure that the various situations, and in particular the accidents, that are considered plausible have been taken into account, and that the monitoring systems and the engineered safety and safeguard systems will be capable of ensuring the safety goals. The probabilistic approach, on the other hand, tries to demonstrate that the safety requirements are met for potential accidents both within and beyond the design basis, thus identifying vulnerabilities not necessarily accessible through deterministic safety analysis alone. Probabilistic safety assessment (PSA) methodology is widely used in the nuclear industry and is especially effective in the comprehensive assessment of the measures needed to prevent accidents with small probability but severe consequences. Still, the trend towards risk-informed regulation (RIR) has demanded a more extended use of risk assessment techniques, with a significant need to further extend the scope and quality of PSA. This is where the theory of stimulated dynamics (TSD) intervenes, as it is the mathematical foundation of the integrated safety assessment (ISA) methodology developed by the Modelling and Simulation (MOSI) branch of the CSN (Consejo de Seguridad Nuclear). This methodology attempts to extend classical PSA by including accident dynamic analysis, an assessment of the damage associated with the transients and a computation of the damage frequency. The application of the ISA methodology requires a computational framework called SCAIS (Simulation Code System for Integrated Safety Assessment). SCAIS provides accident dynamic analysis support through simulation of nuclear accident sequences and operating procedures. Furthermore, it includes probabilistic quantification of fault trees and sequences, and integration and statistical treatment of risk metrics. SCAIS makes intensive use of code coupling techniques to join typical thermal-hydraulic analysis, severe accident and probability calculation codes. The integration of accident simulation in the risk assessment process, and the resulting need for complex nuclear plant models, is what makes it so powerful, yet at the cost of an enormous increase in complexity. As the complexity of the process is primarily concentrated in the accident simulation codes, the question arises of whether it is possible to reduce the number of required simulations; this is the focus of the present work. This document presents the work done on the investigation of more efficient techniques applied to the risk assessment process within the ISA methodology. These techniques therefore have the primary goal of decreasing the number of simulations needed for an adequate estimation of the damage probability. As the methodology and tools are relatively recent, little work has been done along this line of investigation, making it a difficult but necessary task, and because of time limitations the scope of the work had to be reduced.
Therefore, some assumptions were made in order to work in simplified scenarios best suited for an initial approximation to the problem. The following section explains in detail the process followed to design and test the developed techniques. The next section then introduces the general concepts and formulae of the TSD theory, which are at the core of the risk assessment process. Afterwards, a description of the simulation framework requirements and design is given, followed by an introduction to the developed techniques, giving full detail of their mathematical background and procedures. Later, the test case used is described and the results from the application of the techniques are shown. Finally, the conclusions are presented and future lines of work are outlined.
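To illustrate why the number of simulations matters, the following is a crude Monte Carlo sketch of a damage-probability estimate, where each sample would in practice require one expensive accident-dynamics simulation; the surrogate damage criterion is hypothetical and this is not the ISA/TSD formulation.

```python
import numpy as np

# Crude Monte Carlo sketch of a damage-probability estimate (not the ISA/TSD
# formulation): each sample would in practice require one expensive
# accident-dynamics simulation, which is why reducing the sample count matters.
rng = np.random.default_rng(42)

def simulated_damage(sample):
    # Hypothetical surrogate: damage occurs when a sampled "time to intervention"
    # exceeds a fixed threshold. A real study would run a plant simulation here.
    return sample > 2.5

samples = rng.exponential(scale=1.0, size=10_000)   # sampled sequence parameters
p_damage = np.mean([simulated_damage(s) for s in samples])
std_err = np.sqrt(p_damage * (1 - p_damage) / samples.size)
print(f"estimated damage probability: {p_damage:.4f} +/- {std_err:.4f}")
```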
Abstract:
When we interact with the environment in our daily life (using a toothbrush, opening doors, using cell phones, etc.), or in professional situations (medical interventions, manufacturing processes, etc.), we typically perform dexterous manipulations that involve multiple fingers and the palm of both hands. Multi-finger haptic methods can therefore provide a realistic and natural human-machine interface that enhances immersion when interacting with simulated or remote environments. Most commercial devices allow haptic interaction with only one contact point, which may be sufficient for some exploration or palpation tasks but is not enough to perform advanced object manipulations such as grasping. In this thesis, I investigate the mechanical design, control and applications of a modular haptic device that can provide force feedback to the index, thumb and middle fingers of the user. The mechanical design is optimized with a multi-objective design function to achieve low inertia, a large workspace, high manipulability, and force feedback of up to 3 N within the workspace; the bandwidth and rigidity of the device are assessed through simulation and real experimentation. One of the most important areas when designing haptic devices is the end-effector, since it is the part in contact with the user. This thesis describes the design and evaluation of a thimble-like, lightweight, user-adaptable and cost-effective end-effector that incorporates four contact force sensors; this design allows the forces applied by a user during manipulation of virtual and real objects to be estimated.
The design of a real-time, modular control architecture for multi-finger haptic interaction is also described, together with the requirements for controlling such devices: a large number of signals have to be acquired, processed and sent over the network, and mathematical computations such as the direct and inverse kinematics of the device, the Jacobian, grasp detection algorithms, etc., have to be carried out in real time to assure the high fidelity required for haptic interaction. The hardware control architecture has different modules and consists of an FPGA for the low-level controller and a real-time controller for managing all the complex calculations (Jacobian, kinematics, etc.); this provides a compact and scalable solution with the computation capability needed to guarantee a 1 kHz control-loop rate. A set-up for dexterous virtual and real manipulation is described. Moreover, a new algorithm named the iterative kinematic decoupling method was implemented to solve the inverse kinematics of a robotic manipulator and compared with other current methods. In order to understand the importance of multimodal interaction including haptics, a subject study was carried out to look for the sensory stimuli that correlate with fast response times and enhanced accuracy. This experiment was carried out in collaboration with neuroscientists from the Technion Israel Institute of Technology. By comparing the grasping response times in unimodal (auditory, visual and haptic) events with the response times in events with bimodal and trimodal combinations, it is concluded that in grasping tasks the synchronized motion of the fingers to generate the grasping response relies mainly on haptic cues. This processing-speed advantage of haptic cues suggests that multimodal haptic virtual environments are superior in generating motor contingencies, enhancing the plausibility of events. Applications that include haptics provide users with more time at the cognitive stages to fill in missing information creatively and form a richer experience.
A major application of haptic devices is the design of new simulators to train manual skills in the medical sector. In collaboration with physical therapists from Griffith University in Australia, we developed a simulator for hand rehabilitation exercises. First, the non-linear stiffness properties of the metacarpophalangeal joint of the index finger were estimated using the designed end-effector; these parameters were then implemented in a scenario that simulates the behavior of the human hand and allows haptic interaction through the designed device. The potential applications of this work are related to educational and medical training purposes. In this thesis, new methods to simultaneously control the position and orientation of a robotic manipulator and the grasp of a robotic hand when interacting with large real environments are also studied. The reachable workspace is extended by automatically switching between rate and position control modes. Moreover, the human hand gesture is recognized by reading the relative movements of the index, thumb and middle fingers of the user during the early stages of the approach-to-the-object phase, and is then mapped to the robotic hand actuators. These methods are validated by performing dexterous manipulation of objects with a robotic manipulator and different robotic hands. This work is the result of a research collaboration with researchers from the Harvard BioRobotics Laboratory. The experiments show that the overall task time is reduced and that the developed methods allow full dexterity and correct completion of dexterous manipulations.
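The abstract does not detail the iterative kinematic decoupling method, so the following is only a generic illustration of iterative inverse kinematics: a damped least-squares (Jacobian-based) iteration for a planar two-link arm, with hypothetical link lengths and damping.

```python
import numpy as np

# Generic iterative inverse-kinematics sketch (damped least squares) for a planar
# two-link arm. This only illustrates the idea of iterating with the Jacobian; it
# is NOT the thesis's "iterative kinematic decoupling" method, whose details the
# abstract does not give. Link lengths and damping are hypothetical.
L1, L2 = 0.3, 0.25

def fk(q):
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik(target, q0, damping=0.05, iters=200):
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        e = target - fk(q)                       # Cartesian error
        J = jacobian(q)
        # Damped least-squares update: dq = J^T (J J^T + lambda^2 I)^-1 e
        dq = J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(2), e)
        q += dq
    return q

q_sol = ik(np.array([0.35, 0.20]), q0=[0.1, 0.1])
print(q_sol, fk(q_sol))
```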
Abstract:
Introduction: Diffusion-weighted imaging (DWI) techniques are able to measure, in vivo and non-invasively, the diffusivity of water molecules inside the human brain. DWI has been applied to cerebral ischemia, brain maturation, epilepsy, multiple sclerosis, etc. [1]. Nowadays, these images are widely available. DWI allows the identification of brain tissues, so their accurate segmentation is a common initial step for the applications mentioned above.
Materials and Methods: We present a validation study on automated segmentation of DWI based on Gaussian mixture and hidden Markov random field models. This formulation is usually solved with the iterated conditional modes algorithm, but some studies suggest [2] that graph-cuts (GC) algorithms improve the results when the initialization is not close to the final solution. We implemented a segmentation tool integrating ITK with a GC algorithm [3], and a validation tool using fuzzy overlap measures [4].
Results: The segmentation accuracy of each tool is tested against a gold-standard segmentation obtained from a T1 MPRAGE magnetic resonance image of the same subject, registered to the DWI space. The proposed software shows meaningful improvements when using the GC energy minimization approach on DTI and DSI (Diffusion Spectrum Imaging) data.
Conclusions: The segmentation of brain tissues on DWI is a fundamental step in many applications. The proposed software achieves improvements in accuracy and robustness, with a high impact on the final result of those applications.
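As a rough illustration of the iterated-conditional-modes baseline mentioned above (not the ITK/graph-cuts tool that was validated), each pixel label can be updated by minimizing a Gaussian data term plus a Potts smoothness prior:

```python
import numpy as np

# Rough illustration of MRF-based segmentation with iterated conditional modes
# (the baseline mentioned in the abstract), not the ITK/graph-cuts tool itself.
# Each pixel label minimizes a Gaussian data term plus a Potts smoothness term.
rng = np.random.default_rng(1)
means = np.array([0.0, 1.0, 2.0])          # hypothetical class intensities
truth = np.repeat(np.arange(3), 20)[None, :] * np.ones((30, 1), dtype=int)
image = means[truth] + 0.4 * rng.standard_normal(truth.shape)

labels = np.abs(image[..., None] - means).argmin(-1)   # maximum-likelihood init
beta, sigma = 1.5, 0.4                                  # smoothness weight, noise std

for _ in range(5):                                      # ICM sweeps
    for i in range(1, labels.shape[0] - 1):
        for j in range(1, labels.shape[1] - 1):
            neigh = [labels[i-1, j], labels[i+1, j], labels[i, j-1], labels[i, j+1]]
            data = (image[i, j] - means) ** 2 / (2 * sigma ** 2)
            prior = np.array([beta * sum(n != k for n in neigh) for k in range(3)])
            labels[i, j] = np.argmin(data + prior)

print("voxel agreement with ground truth:", (labels == truth).mean())
```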
Abstract:
The first step in engineering design and development is an analysis and simulation process intended to corroborate the initial hypothesis and find solutions for a particular problem. In this way, it is possible to obtain empirical evidence that suitably substantiates the purposes of the project. Commonly, the characteristics needed to reach a particular target are found through iterative trial-and-error methods. These methods are based on the same theoretical analysis but with variations of some parameters, with the objective of adapting the results to a particular aim. At present, powerful computers and mathematical algorithms are available to solve different kinds of calculation problems in a fast and efficient way. Developing computing applications is useful because it provides accurate results for engineering analysis and synthesis in short periods of time. This is especially notable in cases where the mathematical expressions of a theoretical basis are similar but with small variations of constant values, because the programming code can easily be adapted into a parameter-request system that defines a particular solution on each execution. Additionally, it is possible to code an application suitable for simulating any problem related to the studied technology.
The aim of the present project is the construction of the first stage of an optoelectronics simulator named Slabsim, which is capable of representing the energy distribution of a light wave guided in the volume of a planar dielectric (slab) waveguide. The simulator is built with the graphical user interface development environment Matlab GUIDE, property of Mathworks©, and is designed so that the user can run simulations easily and intuitively with only a basic knowledge of the theoretical foundations of this technology. With this software, an engineer needs less time to find a solution that satisfies the requirements of a project involving slab waveguides, and can use it for a wide range of purposes based on this technology. One of the main objectives of this project is the numerical solution of the theoretical equations of slab structures, since the procedures involved can be extrapolated to other mathematical problems and give the author a solid conceptual grounding in them. For this reason, the differential and characteristic equations that arise for this kind of structure are solved by numerical methods in the core of the application, given that in some cases no useful analytical expressions are available.
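To show the kind of numerical root-finding such a simulator needs at its core (a generic sketch, not the Slabsim code), the fundamental even TE mode of a symmetric slab waveguide satisfies a transcendental characteristic equation that can be solved by bisection:

```python
import numpy as np

# Generic sketch of the numerical core such a simulator needs (not the Slabsim code):
# the fundamental even TE mode of a symmetric slab waveguide satisfies the
# transcendental characteristic equation  u*tan(u) = sqrt(V^2 - u^2),  u = kappa*d/2,
# which has no useful closed-form solution and is solved here by bisection.
n_core, n_clad = 1.50, 1.45          # hypothetical refractive indices
d = 5e-6                             # slab thickness (m)
wavelength = 1.55e-6                 # vacuum wavelength (m)

k0 = 2 * np.pi / wavelength
V = (k0 * d / 2) * np.sqrt(n_core**2 - n_clad**2)

def characteristic(u):
    return u * np.tan(u) - np.sqrt(V**2 - u**2)

# Bisection on the first branch of tan(u), where the fundamental root lies.
lo, hi = 1e-9, min(V, np.pi / 2) - 1e-9
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if characteristic(lo) * characteristic(mid) <= 0:
        hi = mid
    else:
        lo = mid

u = 0.5 * (lo + hi)
kappa = 2 * u / d
beta = np.sqrt((n_core * k0)**2 - kappa**2)
print(f"V = {V:.3f}, effective index of the fundamental TE mode = {beta / k0:.5f}")
```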
Abstract:
Dimensionality Reduction (DR) is attracting more attention these days as a result of the increasing need to handle huge amounts of data effectively. DR methods allow the number of initial features to be reduced considerably until a subset is found that preserves the original properties of the data. However, their use entails an inherent loss of quality that is likely to affect the understanding of the data in terms of data analysis. This loss of quality could be decisive when selecting a DR method, because of the nature of each method. In this paper, we propose a methodology that allows different DR methods to be analyzed and compared as regards the loss of quality they produce. This methodology makes use of the concept of preservation of geometry (quality assessment criteria) to assess the loss of quality. Experiments have been carried out using the most well-known DR algorithms and quality assessment criteria from the literature. These experiments have been applied to 12 real-world datasets. Results obtained so far show that it is possible to establish a method to select the most appropriate DR method in terms of minimum loss of quality. The experiments have also highlighted some interesting relationships between the quality assessment criteria. Finally, the methodology allows the appropriate choice of dimensionality for reducing data to be established, whilst giving rise to a minimum loss of quality.
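As a minimal sketch of the idea of comparing DR methods through a quality-preservation criterion (an illustration only, not the paper's full methodology or datasets), trustworthiness can be used to score how well two standard methods preserve local neighbourhoods:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap, trustworthiness

# Minimal sketch of comparing DR methods by a quality-preservation criterion
# (trustworthiness, one neighbourhood-preservation measure from the literature);
# an illustration of the idea, not the paper's methodology or its 12 datasets.
X, _ = load_digits(return_X_y=True)
X = X[:500]                                         # keep the example fast

methods = {"PCA": PCA(n_components=2), "Isomap": Isomap(n_components=2)}
for name, method in methods.items():
    X_low = method.fit_transform(X)
    t = trustworthiness(X, X_low, n_neighbors=10)   # 1.0 = geometry fully preserved
    print(f"{name}: trustworthiness = {t:.3f}")
```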
Abstract:
At present, all methods in Evolutionary Computation are bioinspired by the fundamental principles of neo-Darwinism, as well as by vertical gene transfer. Virus transduction is one of the key mechanisms of horizontal gene propagation in microorganisms (e.g. bacteria). In the present paper, we model and simulate a transduction operator, exploring the possible role and usefulness of transduction in a genetic algorithm. The genetic algorithm including transduction has been named PETRI (an abbreviation of Promoting Evolution Through Reiterated Infection). Our results show how PETRI reaches higher fitness values as the transduction probability comes close to 100%. The conclusion is that transduction improves the performance of a genetic algorithm, assuming a population divided among several sub-populations or "bacterial colonies".
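A rough sketch of what a transduction-like operator could look like in a genetic algorithm with sub-populations (an illustration of the concept, not the PETRI implementation): a gene fragment copied from a donor individual in one colony overwrites the corresponding genes of a host in another colony.

```python
import random

# Rough sketch of a transduction-like (horizontal gene transfer) operator between
# sub-populations of a genetic algorithm. This illustrates the concept only; it is
# not the PETRI implementation described in the paper.
def transduce(colonies, p_transduction=0.9, fragment_len=3):
    for colony in colonies:
        if random.random() > p_transduction:
            continue
        donor_colony = random.choice(colonies)
        donor = random.choice(donor_colony)          # individual carrying the "virus"
        host = random.choice(colony)                 # individual being infected
        start = random.randrange(len(host) - fragment_len)
        # The viral fragment replaces the host's genes at the same locus.
        host[start:start + fragment_len] = donor[start:start + fragment_len]
    return colonies

random.seed(0)
colonies = [[[random.randint(0, 1) for _ in range(10)] for _ in range(5)] for _ in range(3)]
transduce(colonies)
print(colonies[0][0])
```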
Abstract:
This article addresses the problem of registering several 3D-scanner captures of the same scene in order to generate a single three-dimensional model, and introduces a solution based on genetic algorithms to find the overlapping area between two point-cloud captures. Considering three translation coordinates and three rotation angles, the genetic algorithm evaluates the matching points in the overlapping area between the two captures under that transformation. Genetic simulated annealing is used to improve the accuracy of the results obtained by the genetic algorithm.
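A minimal sketch of the kind of fitness evaluation this approach implies (a hypothetical illustration, not the article's code): an individual encodes three translations and three rotation angles, and its fitness counts the points of one cloud that land close to the other cloud after applying that transformation.

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical registration fitness for a genetic algorithm (illustration only,
# not the article's implementation). An individual encodes (tx, ty, tz, roll,
# pitch, yaw); fitness counts points of cloud A that fall within a tolerance of
# cloud B after applying that rigid transformation.
def rotation(roll, pitch, yaw):
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def fitness(individual, cloud_a, cloud_b, tol=0.01):
    t, angles = np.array(individual[:3]), individual[3:]
    transformed = cloud_a @ rotation(*angles).T + t
    dist, _ = cKDTree(cloud_b).query(transformed)      # nearest-neighbour distances
    return np.count_nonzero(dist < tol)                # matched points in the overlap

rng = np.random.default_rng(0)
cloud_b = rng.random((500, 3))
true_t = np.array([0.1, 0.0, 0.0])
true_R = rotation(0.0, 0.0, 0.05)
cloud_a = (cloud_b - true_t) @ true_R      # so cloud_a @ true_R.T + true_t == cloud_b
print(fitness([0.1, 0.0, 0.0, 0.0, 0.0, 0.05], cloud_a, cloud_b))   # ~500 matches
```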
Abstract:
In this article, a novel approach to the design of in-building wireless network deployments is proposed. This approach, known as MOQZEA (Multiobjective Quality Zone Based Evolutionary Algorithm), is a hybrid evolutionary algorithm adapted to use a novel fitness function based on the definition of quality zones for the different objective functions considered. It is conceived to solve wireless network design problems without prior information on the required number of transmitters, considering a high number of objective functions simultaneously and optimizing multiple configuration parameters of the transmitters.
Abstract:
This paper describes the objectives, content, learning methodology and results of an online course on the History of Algorithms for engineering students at the Polytechnic University of Madrid (UPM). The course is conducted in a virtual environment based on Moodle, with a student-centred educational model that includes detailed planning of the learning activities. Our experience indicates that this subject is highly motivating for students and that the virtual environment facilitates the development of competencies.
Abstract:
Plant diseases represent a major economic and environmental problem in agriculture and forestry. Upon infection, a plant develops symptoms that affect different parts of the plant, causing a significant agronomic impact. As many such diseases spread over time through the whole crop, a system for early disease detection can help mitigate the losses produced by plant diseases and can further prevent their spread [1]. In recent years, several mathematical search algorithms have been proposed [2,3] that could be used as non-invasive, fast, reliable and cost-effective methods to localize infectious foci in space by detecting changes in the profile of volatile organic compounds. Tracking scents and locating odor sources is a major challenge in robotics, on the one hand because odour plumes consist of non-uniform, intermittent odour patches dispersed by the wind, and on the other hand because of the lack of precise and reliable odour sensors. Notwithstanding, we have developed a simple robotic platform to study the robustness and effectiveness of different search algorithms [4] with respect to specific problems found in their further application in agriculture, namely errors committed in motion and sensing, and the existence of spatial constraints due to land topology or the presence of obstacles.
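One of the simplest strategies such a platform could benchmark is a greedy, noise-corrupted gradient ascent; the sketch below is only an illustration of how sensing errors affect the search, not one of the specific algorithms from the cited works, and it uses a smooth plume surrogate rather than a realistic intermittent one.

```python
import numpy as np

# Sketch of one of the simplest search strategies such a platform could benchmark
# (a greedy walk on noisy concentration readings), not an algorithm from [2-4].
# The robot moves to the neighbouring cell with the highest noisy reading; raising
# the noise level quickly degrades source localization.
rng = np.random.default_rng(3)
source = np.array([20, 15])

def concentration(pos, noise=0.01):
    true_c = np.exp(-np.linalg.norm(pos - source) / 20.0)   # smooth plume surrogate
    return true_c + noise * rng.standard_normal()

pos = np.array([2, 3])
for step in range(200):
    moves = [pos + d for d in ([1, 0], [-1, 0], [0, 1], [0, -1])]
    pos = max(moves, key=concentration)                     # greedy noisy ascent
    if np.array_equal(pos, source):
        break
print("steps taken:", step + 1, "final position:", pos)
```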
Abstract:
The diversity of bibliometric indices today poses the challenge of exploiting the relationships among them. Our research uncovers the best core set of relevant indices for predicting other bibliometric indices. An added difficulty is to select the role of each variable, that is, which bibliometric indices are predictor variables and which are response variables. This results in a novel multi-output regression problem where the role of each variable (predictor or response) is unknown beforehand. We use Gaussian Bayesian networks to solve this problem and discover multivariate relationships among bibliometric indices. These networks are learnt by a genetic algorithm that looks for the optimal models that best predict bibliometric data. Results show that the optimal induced Gaussian Bayesian networks corroborate previous relationships between several indices, but also suggest new, previously unreported interactions. An extended analysis of the best model illustrates that a set of 12 bibliometric indices can be accurately predicted using only a smaller predictive core subset composed of citations, the g-index, the q2-index and the hr-index. This research is performed using bibliometric data on Spanish full professors in the computer science area.
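As an illustrative sketch of the prediction task only (not the Gaussian Bayesian network or genetic algorithm of the paper), the remaining indices can be regressed on a small core set; the data below are synthetic stand-ins for real bibliometric records.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative sketch of the prediction task (not the Gaussian Bayesian network /
# genetic algorithm of the paper): other bibliometric indices are regressed on a
# small predictive core, with synthetic data standing in for real records.
rng = np.random.default_rng(7)
n = 200
core = rng.poisson(lam=[120, 15, 10, 12], size=(n, 4)).astype(float)  # synthetic "citations, g, q2, hr"
coeffs = rng.random((4, 12))
others = core @ coeffs + rng.normal(scale=2.0, size=(n, 12))          # 12 remaining indices

model = LinearRegression().fit(core[:150], others[:150])
r2 = model.score(core[150:], others[150:])
print(f"held-out R^2 when predicting 12 indices from the 4-index core: {r2:.2f}")
```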
Abstract:
One of the most promising areas in which probabilistic graphical models have shown incipient activity is the field of heuristic optimization and, in particular, Estimation of Distribution Algorithms. Due to their inherent parallelism, different research lines have tried to improve Estimation of Distribution Algorithms in terms of execution time and/or accuracy. Among these proposals, we focus on the so-called distributed or island-based models. This approach defines several islands (algorithm instances) running independently and exchanging information with a given frequency. The information sent by the islands can be either a set of individuals or a probabilistic model. This paper presents a comparative study of a distributed univariate Estimation of Distribution Algorithm and a multivariate version, paying special attention to the comparison of two alternative methods for exchanging information, over a wide set of parameters and problems: the standard benchmark developed for the IEEE Workshop on Evolutionary Algorithms and other Metaheuristics for Continuous Optimization Problems of the ISDA 2009 Conference. Several analyses from different points of view have been conducted to study both the influence of the parameters and the relationships between them, including a characterization of the configurations according to their behavior on the proposed benchmark.
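A minimal sketch of an island-based univariate EDA in which islands periodically exchange their probabilistic models (one of the two information-exchange schemes compared); the OneMax objective and all parameter values are illustrative and not the ISDA 2009 benchmark used in the paper.

```python
import numpy as np

# Minimal sketch of an island-based univariate EDA (UMDA-style) in which islands
# periodically exchange their probabilistic models, one of the two information-
# exchange schemes discussed. The OneMax objective and all parameters are
# illustrative, not the ISDA 2009 benchmark used in the paper.
rng = np.random.default_rng(0)
N_ISLANDS, POP, DIM, GENS, EXCHANGE_EVERY = 4, 50, 40, 60, 10

def onemax(pop):
    return pop.sum(axis=1)

models = [np.full(DIM, 0.5) for _ in range(N_ISLANDS)]   # one Bernoulli vector per island
for gen in range(GENS):
    for i in range(N_ISLANDS):
        pop = (rng.random((POP, DIM)) < models[i]).astype(int)   # sample the model
        best = pop[np.argsort(onemax(pop))[-POP // 2:]]          # truncation selection
        models[i] = best.mean(axis=0)                            # re-estimate marginals
    if gen % EXCHANGE_EVERY == 0:
        # Migration step: each island mixes its model with its neighbour's model.
        models = [0.5 * (models[i] + models[(i + 1) % N_ISLANDS]) for i in range(N_ISLANDS)]

print("best marginal vector (island 0):", np.round(models[0], 2))
```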