865 results for Ant-based algorithm


Relevance:

80.00%

Publisher:

Abstract:

Hopes of halting the loss of biological diversity are tied to the realization of 'ecological networks'. Plans for building such connected systems are emerging both at the pan-European level (Pan-European Ecological Network - PEEN) and within individual states. In federal Germany, small-scale habitat-network plans (Biotopverbund) are drawn up at the state level, and first concepts for a national habitat network exist. The present work addresses these supra-local, strategically preparatory planning levels. The aims of such networks are the preservation of populations, particularly of endangered species, and the facilitation of dispersal and migration. Because of missing data on species and populations, the concepts and models of population ecology cannot readily be transferred to these supra-local planning levels. In line with the above objectives, however, the planning of connected systems should be guided by the requirements of the species that depend on habitat connectivity. The aim of this work was to develop a practicable GIS-based planning aid that integrates as much ecological knowledge as possible under conditions of limited information availability. As a foundation, the global, European-international, and national framework conditions and requirements for building connected systems are first compiled in overview form. The PEEN strategies deserve emphasis here, as they call for the integration of ecological content, particularly through the consideration of spatial-functional relationships. A comprehensive analysis of the state-wide habitat-network plans in Germany revealed the partly considerable differences between the state plans, which currently make it impossible to assemble them into a coherent national concept. Not all states have state-wide habitat-network plans, and state concepts in which the planned network is based on the requirements of species exist only in rudimentary form. Furthermore, a targeted suitability assessment of existing GIS-based connectivity models and concepts was carried out, taking into account the data regularly available in Germany. Since no integrative, rule-oriented approaches existed, the vector-based algorithm HABITAT-NET was developed. It works with 'requirement types' for habitat connectivity, which stand in for different ecological groups of (target) species with terrestrial dispersal; dispersal ability is combined with a coarse typification of habitat affinity. The most important input data are the (potential) habitats of the species of a requirement type and the surrounding land use. In forming 'habitat networks' (Part I), graded 'functional and connecting spaces' are generated and linked into a spatial system. The current fragmentation of these networks by transport routes can then be mapped, and priority sections for re-connection derived from it (Part II). Alongside this, the concept of unfragmented functional spaces (UFR) is developed, which makes it possible to indicate habitat fragmentation at the landscape level. Finally, the suitability of the results as a small-scale target framework, validation tests, comparisons with existing connectivity plans, and various settings in the GIS algorithm are discussed.
Possible applications are outlined, for example, for habitat-network and landscape planning, spatial planning, strategic environmental assessment, transport-route planning, support for the habitat-corridor concept, coherence of the NATURA 2000 protected-area system, and the development of environmental information systems. The work closes by linking a retrospective and outlook with the formulation of further research needs.
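
For readers unfamiliar with vector-based connectivity tools, the following is a minimal sketch of the kind of operation a tool like HABITAT-NET builds on (its actual rule set is far richer): habitat patches of one requirement type are buffered by an assumed dispersal distance, and overlapping buffers merge into connected functional spaces. The shapely representation, the patch geometries, and the dispersal distance are all illustrative assumptions.

```python
# Sketch only: merging habitat patches into 'functional spaces' by a
# dispersal-distance buffer, in the spirit of vector-based connectivity
# tools such as HABITAT-NET (whose actual rule set is far richer).
from shapely.geometry import Polygon
from shapely.ops import unary_union

# Hypothetical habitat patches of one 'requirement type' (map units: m)
habitats = [
    Polygon([(0, 0), (200, 0), (200, 150), (0, 150)]),
    Polygon([(450, 0), (650, 0), (650, 150), (450, 150)]),
    Polygon([(2000, 0), (2200, 0), (2200, 150), (2000, 150)]),
]
dispersal_distance = 300.0  # assumed dispersal ability of this type

# Buffer each patch by half the dispersal distance; patches whose
# buffers touch can be reached by dispersing individuals.
buffered = [h.buffer(dispersal_distance / 2) for h in habitats]
merged = unary_union(buffered)

# Connected components = candidate functional/connecting spaces
spaces = getattr(merged, "geoms", [merged])
for i, space in enumerate(spaces):
    print(f"functional space {i}: area = {space.area:.0f} m^2")
```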

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a high-level reinforcement learning (RL) control system for solving the action-selection problem of an autonomous robot. Although the dominant approach when using RL has been to apply value-function-based algorithms, the system detailed here is characterized by the use of direct policy search methods. Rather than approximating a value function, these methodologies approximate a policy using an independent function approximator with its own parameters, trying to maximize the future expected reward. The policy-based algorithm presented in this paper is used for learning the internal state/action mapping of a behavior. In this preliminary work, we demonstrate its feasibility with simulated experiments using the underwater robot GARBI in a target-reaching task.
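
As an illustration of the direct policy-search idea (a generic REINFORCE-style update, not the authors' exact algorithm), the sketch below parameterizes a Gaussian policy and shifts its parameters along a sampled gradient of the expected reward; the one-dimensional target-reaching toy task and all constants are assumptions.

```python
# Generic REINFORCE-style policy-gradient sketch (not the paper's exact
# method): a Gaussian policy over a 1-D action, updated to maximize
# expected reward on a toy 'reach the target' task.
import numpy as np

rng = np.random.default_rng(0)
theta = np.zeros(2)     # policy parameters: [weight, bias]
sigma = 0.3             # fixed exploration noise
alpha = 0.01            # learning rate
baseline = 0.0          # running reward baseline (variance reduction)
target = 1.0            # hypothetical target position

for episode in range(5000):
    state = rng.uniform(-1.0, 1.0)              # random start position
    mean = theta[0] * state + theta[1]          # policy mean action
    action = rng.normal(mean, sigma)            # sample from the policy
    reward = -(state + action - target) ** 2    # closer to target = better

    # REINFORCE: gradient of log pi(a|s) for a Gaussian with fixed sigma
    grad_log_pi = (action - mean) / sigma**2 * np.array([state, 1.0])
    theta += alpha * (reward - baseline) * grad_log_pi
    baseline += 0.05 * (reward - baseline)      # track average reward

print("learned parameters:", theta)  # ideally near [-1, 1]: action = target - state
```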

Relevance:

80.00%

Publisher:

Abstract:

The optimization of systems and models has become one of the most important factors in the pursuit of process efficiency. School transportation is no exception: it is an environment that changes constantly with the needs of its clients and carries a strong responsibility toward its users, the children who ride the service, in terms of punctuality and safety, while constantly seeking cost reductions. This project describes the problems encountered in this area at The English School and proposes a simple optimization model that enables notable improvements in times and costs, generating benefits for the institution both financially and in customer satisfaction. Implementing this model makes it possible to identify common errors in the process, to identify practical, easily applied solutions for managing the transport service, and to present the results obtained on the sample used to develop the project.

Relevance:

80.00%

Publisher:

Abstract:

The nonlinearity of high-power amplifiers (HPAs) has a crucial effect on the performance of multiple-input multiple-output (MIMO) systems. In this paper, we investigate the performance of MIMO orthogonal space-time block coding (OSTBC) systems in the presence of nonlinear HPAs. Specifically, we propose a constellation-based compensation method for HPA nonlinearity for the case where the HPA parameters are known at the transmitter and receiver, in which the constellation and decision regions of the distorted transmitted signal are derived in advance. Furthermore, for the scenario where the HPA parameters are unknown, we propose a sequential Monte Carlo (SMC)-based compensation method for the HPA nonlinearity, which first estimates the channel-gain matrix by means of the SMC method and then uses the SMC-based algorithm to detect the desired signal. The performance of the MIMO-OSTBC system under study is evaluated in terms of average symbol error probability (SEP), total degradation (TD), and system capacity, in uncorrelated Nakagami-m fading channels. Numerical and simulation results are provided and show the effects on performance of several system parameters, such as the parameters of the HPA model, the output back-off (OBO) of the nonlinear HPA, the numbers of transmit and receive antennas, the modulation order of the quadrature amplitude modulation (QAM), and the number of SMC samples. In particular, it is shown that the constellation-based compensation method can efficiently mitigate the effect of HPA nonlinearity with low complexity, and that the SMC-based detection scheme efficiently compensates for HPA nonlinearity when the HPA parameters are unknown.
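
To make the distortion concrete, the sketch below applies a memoryless Saleh HPA model (a common choice in the literature; whether it matches the paper's HPA model is an assumption) to a 16-QAM constellation and detects against the distorted points, which is the spirit of constellation-based compensation when the HPA parameters are known. All parameter values are illustrative.

```python
# Sketch: distorting a 16-QAM constellation with a memoryless Saleh HPA
# model, then using the distorted points as decision references.
import numpy as np

def saleh_hpa(x, a_a=2.0, b_a=1.0, a_p=np.pi / 3, b_p=1.0):
    """Memoryless Saleh model: AM/AM and AM/PM distortion (toy parameters)."""
    r = np.abs(x)
    gain = a_a * r / (1 + b_a * r**2)            # AM/AM characteristic
    phase = a_p * r**2 / (1 + b_p * r**2)        # AM/PM characteristic
    return gain * np.exp(1j * (np.angle(x) + phase))

# 16-QAM constellation normalized to unit average power
levels = np.array([-3, -1, 1, 3])
const = np.array([i + 1j * q for i in levels for q in levels])
const = const / np.sqrt(np.mean(np.abs(const) ** 2))

backoff = 10 ** (-6 / 20)                 # 6 dB input back-off (illustrative)
distorted = saleh_hpa(backoff * const)    # decision references, precomputed

def detect(y):
    """Minimum-distance detection against the *distorted* constellation."""
    return int(np.argmin(np.abs(y - distorted)))

rng = np.random.default_rng(0)
tx = 5
rx = saleh_hpa(backoff * const[tx]) + 0.01 * (rng.normal() + 1j * rng.normal())
print(detect(rx) == tx)                   # True: distortion compensated
```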

Relevance:

80.00%

Publisher:

Abstract:

This paper addresses the issue of activity understanding from video and its semantics-rich description. A novel approach is presented in which activities are characterised and analysed at different resolutions; semantic information is delivered according to the resolution at which the activity is observed. Furthermore, the multiresolution activity characterisation is exploited to detect abnormal activity. To achieve these system capabilities, the focus is on context modelling: a soft-computing-based algorithm automatically determines the main activity zones of the observed scene by taking as input the trajectories of detected mobiles. Such areas are learnt at different resolutions (or granularities). In a second stage, the learned zones are employed to extract people's activities by relating mobile trajectories to the zones; in this way, the activity of a person can be summarised as the series of zones that the person has visited. Exploiting the inherent properties of soft relations, the reported activities can be labelled with meaningful semantics. Depending on the granularity at which activity zones and mobile trajectories are considered, the semantic meaning of the activity shifts from broad interpretation to detailed description. Activity information at different resolutions is also employed to perform abnormal activity detection.
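
Under strong simplifying assumptions, the zone-learning idea can be sketched as follows: trajectory points are clustered into activity zones at two granularities (plain k-means standing in for the paper's soft-computing method), and an activity is then summarized as the sequence of zones a trajectory visits. The scene data are made up.

```python
# Sketch: learning activity zones from trajectory points at two
# granularities and summarizing a trajectory as a sequence of zones.
# k-means stands in for the paper's soft-computing zone learner.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical positions of tracked mobiles, concentrated in three areas
points = np.vstack([
    rng.normal((2, 2), 0.3, (100, 2)),    # e.g. an entrance area
    rng.normal((8, 2), 0.3, (100, 2)),    # e.g. a counter
    rng.normal((5, 7), 0.3, (100, 2)),    # e.g. a seating area
])

# Coarse and fine resolutions of the scene's activity zones
coarse = KMeans(n_clusters=3, n_init=10, random_state=0).fit(points)
fine = KMeans(n_clusters=6, n_init=10, random_state=0).fit(points)

def summarize(trajectory, model):
    """Activity = series of zones visited (consecutive duplicates merged)."""
    zones = model.predict(trajectory)
    return [z for i, z in enumerate(zones) if i == 0 or z != zones[i - 1]]

traj = np.array([[2.0, 2.1], [4.0, 4.0], [7.8, 2.2]])
print("coarse:", summarize(traj, coarse))
print("fine:  ", summarize(traj, fine))
```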

Relevance:

80.00%

Publisher:

Abstract:

We investigate several two-dimensional guillotine cutting stock problems and their variants in which orthogonal rotations are allowed. We first present two dynamic programming based algorithms for the Rectangular Knapsack (RK) problem and its variants in which the patterns must be staged. The first algorithm solves the recurrence formula proposed by Beasley; the second algorithm, for staged patterns, also uses a recurrence formula. We show that if the items are not too small compared with the dimensions of the bin, these algorithms require polynomial time. Using these algorithms we solved all instances of the RK problem found in the OR-LIBRARY, including one for which no optimal solution was previously known. We also consider the Two-dimensional Cutting Stock problem, for which we present a column generation based algorithm that uses the first algorithm mentioned above to generate the columns; we propose two strategies to tackle the residual instances, and we also investigate a variant of this problem in which the bins have different sizes. Finally, we study the Two-dimensional Strip Packing problem, for which we also present a column generation based algorithm, this one using the second algorithm mentioned above with staged patterns imposed; in this case we solve instances for two-, three- and four-staged patterns. We report on computational experiments with the various algorithms proposed in this paper. The results indicate that these algorithms are suitable for solving real-world instances. We give a detailed description (a pseudo-code) of all the algorithms presented here, so that the reader may easily implement them. (c) 2007 Elsevier B.V. All rights reserved.
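
As a reference point, here is a compact sketch of a Beasley-style recurrence for the unstaged guillotine rectangular knapsack: the value of a W × H plate is either the best single item that fits or the best of one vertical or horizontal guillotine cut. Integer dimensions and unlimited copies of each item are assumed, and the item data are made up.

```python
# Sketch of a Beasley-style DP for the guillotine Rectangular Knapsack:
# value(w, h) = best item fitting in (w, h), or the best vertical /
# horizontal guillotine cut. Integer dimensions, unbounded item copies.
from functools import lru_cache

# Hypothetical items: (width, height, value); rotations not considered here
items = [(3, 2, 40), (2, 2, 25), (5, 3, 90)]

@lru_cache(maxsize=None)
def value(w, h):
    # Best single item placed in this plate (rest handled by the cuts below)
    best = max((v for iw, ih, v in items if iw <= w and ih <= h), default=0)
    for x in range(1, w // 2 + 1):            # vertical cuts (symmetric)
        best = max(best, value(x, h) + value(w - x, h))
    for y in range(1, h // 2 + 1):            # horizontal cuts (symmetric)
        best = max(best, value(w, y) + value(w, h - y))
    return best

print(value(10, 5))  # best guillotine-cuttable value for a 10 x 5 plate
```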

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a new approach and coding scheme for solving economic dispatch (ED) problems in power systems through an effortless hybrid method (EHM). The novel coding scheme effectively prevents futile searching and infeasible solutions in the application of stochastic search methods, and consequently improves search efficiency and solution quality dramatically. The dominant constraint of an economic dispatch problem is power balance; operational constraints such as generation limits, ramp rate limits, prohibited operating zones (POZ), and network losses are considered for practical operation. First, in the EHM procedure, the generator outputs are obtained with a lambda-iteration method without considering POZ; this constraint is then satisfied by a genetic-based algorithm. To demonstrate its efficiency, feasibility, and speed, the EHM algorithm was applied to solve constrained ED problems for power systems with 6 and 15 units. The simulation results obtained with the EHM were compared with those reported in the previous literature in terms of solution quality and computational efficiency. The results reveal the superiority of this method in terms of both cost and CPU time. (C) 2011 Elsevier Ltd. All rights reserved.
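
The lambda-iteration starting point of such a hybrid can be sketched as follows (quadratic costs and generation limits only; POZ, ramp rates, and losses, which the genetic stage would handle, are ignored, and the unit data are made up).

```python
# Sketch: classic lambda iteration for economic dispatch with quadratic
# costs C_i(P) = a_i + b_i P + c_i P^2 and generation limits only.
# POZ, ramp rates and losses (handled by EHM's genetic stage) are ignored.

# Hypothetical units: (b_i, c_i, P_min, P_max)
units = [(2.0, 0.010, 50, 300), (1.8, 0.012, 40, 250), (2.2, 0.008, 60, 350)]
demand = 600.0

def dispatch(lmbda):
    # Optimal output per unit: dC/dP = b + 2cP = lambda, clipped to limits
    return [min(max((lmbda - b) / (2 * c), pmin), pmax)
            for b, c, pmin, pmax in units]

lo, hi = 0.0, 20.0                     # bisection bounds on lambda
for _ in range(60):
    mid = (lo + hi) / 2
    if sum(dispatch(mid)) < demand:    # too little generation: raise lambda
        lo = mid
    else:
        hi = mid

outputs = dispatch((lo + hi) / 2)
print([round(p, 1) for p in outputs], "total:", round(sum(outputs), 1))
```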

Relevance:

80.00%

Publisher:

Abstract:

Nonogram is a logical puzzle whose associated decision problem is NP-complete. It has applications in pattern recognition and data compression, among others. The puzzle consists of determining an assignment of colors to pixels distributed in an N × M matrix that satisfies row and column constraints. A Nonogram is encoded by a vector whose elements specify the number of pixels in each row and column of a figure without specifying their coordinates. This work presents exact and heuristic approaches to solving Nonograms. Depth-first search was chosen as one exact approach because it is a typical example of a brute-force search algorithm that is easy to implement. Another exact approach was based on the Las Vegas algorithm, with the aim of investigating whether the randomness introduced by the Las Vegas-based algorithm offers an advantage over depth-first search. The Nonogram is also transformed into a Constraint Satisfaction Problem. Three heuristic approaches are proposed: a Tabu Search and two memetic algorithms. A new way of computing the objective function is also proposed. The approaches are applied to 234 instances, with sizes ranging from 5 × 5 to 100 × 100, including both logical and random Nonograms.
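
As a concrete reference for the exact approaches, here is a minimal depth-first search sketch for binary Nonograms: complete row patterns are enumerated from the row clues, and partial grids are pruned when a column's completed runs stop matching a prefix of its clue. The tiny instance at the end is made up.

```python
# Sketch: depth-first search for a binary Nonogram. Rows are assigned
# whole patterns at a time; columns are pruned on run-length prefixes.
def row_patterns(clue, width):
    """All 0/1 rows of given width whose runs of 1s match the clue."""
    if not clue:
        return [[0] * width]
    run, rest = clue[0], list(clue[1:])
    need = sum(rest) + len(rest)              # space the remaining runs need
    rows = []
    for start in range(width - need - run + 1):
        head = [0] * start + [1] * run + ([0] if rest else [])
        for tail in row_patterns(rest, width - len(head)):
            rows.append(head + tail)
    return rows

def runs(cells):
    """Run lengths of 1s, e.g. [1,1,0,1] -> [2, 1]."""
    out, n = [], 0
    for c in cells:
        if c:
            n += 1
        elif n:
            out.append(n); n = 0
    if n:
        out.append(n)
    return out

def feasible(grid, col_clues, done):
    for j, clue in enumerate(col_clues):
        col = [row[j] for row in grid]
        r = runs(col)
        if done:
            if r != list(clue):
                return False
        else:
            if col and col[-1] == 1:
                r = r[:-1]                    # last run may still grow
            if r != list(clue[:len(r)]):      # completed runs must be a prefix
                return False
    return True

def solve(row_clues, col_clues, grid=None):
    grid = grid or []
    if len(grid) == len(row_clues):
        return grid if feasible(grid, col_clues, True) else None
    for row in row_patterns(row_clues[len(grid)], len(col_clues)):
        cand = grid + [row]
        if feasible(cand, col_clues, len(cand) == len(row_clues)):
            sol = solve(row_clues, col_clues, cand)
            if sol:
                return sol
    return None

# A tiny 3x3 example: a plus sign
print(solve([[1], [3], [1]], [[1], [3], [1]]))
```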

Relevance:

80.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

80.00%

Publisher:

Abstract:

A comprehensive analysis of electrodisintegration proton yields on 90Zr is proposed, taking into account the giant dipole resonance, the isovector giant quadrupole resonance (IVGQR), and quasideuteron contributions to the total photoabsorption cross section from 10 to 140 MeV. The calculation applies the MCMC intranuclear cascade to address the direct and pre-equilibrium emissions and another Monte Carlo-based algorithm to describe the evaporation step. The final results for the total photoabsorption cross section of 90Zr and the relevant decay channels are obtained by fitting the (e,p) measurements from the National Bureau of Standards and show that multiple proton emissions dominate the photonuclear reactions at higher energies. These results provide a consistent explanation for the exotic, steady increase of the (e,p) yield, as well as strong evidence of an IVGQR with a strength parameter compatible with the E2 energy-weighted sum rule. The inclusive photoneutron cross sections for 90Zr and natZr, derived from these results and normalized with the (e,p) data, agree within 10% with both the Livermore and Saclay data up to 140 MeV. © 2007 The American Physical Society.

Relevance:

80.00%

Publisher:

Abstract:

A new approach based on an N-α cluster photoabsorption model is proposed for understanding the puzzling steady increase of the 90Zr(e,α) yield measured at the National Bureau of Standards (NBS) within the giant dipole resonance and quasideuteron energy range. The calculation takes into account the pre-equilibrium emissions of protons, neutrons, and alpha particles in the framework of an extended version of the multicollisional intranuclear cascade model (MCMC). Another Monte Carlo-based algorithm describes the statistical decay of the compound nucleus in terms of the competition between particle evaporation (p, n, d, α, 3He, and t) and nuclear fission. The results reproduce the 90Zr(e,α) yield quite successfully, suggesting that emissions of α particles are essential for interpreting the exotic increase of the cross sections.

Relevance:

80.00%

Publisher:

Abstract:

The development of new technologies that use peer-to-peer networks grows every day, driven by the need to share information, resources, and database services around the world. Among them are peer-to-peer databases, which take advantage of peer-to-peer networks to manage distributed knowledge bases, allowing the sharing of information that is semantically related but syntactically heterogeneous. However, given the structural characteristics of these networks, it is a challenge to ensure efficient search for information without compromising the autonomy of each node or the flexibility of the network. On the other hand, some studies propose the use of ontology semantics to assign a standardized categorization to information. The main original contribution of this work is an approach to this problem based on query optimization supported by the Ant Colony algorithm and on classification through ontologies. The results show that this strategy provides semantic support for searches in peer-to-peer databases, expanding the results without compromising network performance. © 2011 IEEE.
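
A stripped-down sketch of the Ant Colony mechanics such a strategy rests on follows; the paper's ontology-based classification is abstracted into a per-neighbor 'semantic match' score, which is an assumption. Each query-ant picks a neighbor peer with probability proportional to pheromone times heuristic desirability, successful paths are reinforced, and all trails evaporate.

```python
# Sketch: Ant Colony-style neighbor selection for routing queries in a
# peer-to-peer network. Ontology-based classification (the paper's other
# ingredient) is abstracted into a per-neighbor 'semantic match' score.
import random

neighbors = ["peerA", "peerB", "peerC"]
pheromone = {p: 1.0 for p in neighbors}
semantic_match = {"peerA": 0.9, "peerB": 0.4, "peerC": 0.2}  # assumed scores

ALPHA, BETA = 1.0, 2.0     # pheromone vs. heuristic weight
RHO, Q = 0.1, 1.0          # evaporation rate, reinforcement amount

def choose_neighbor():
    weights = [pheromone[p] ** ALPHA * semantic_match[p] ** BETA
               for p in neighbors]
    return random.choices(neighbors, weights=weights)[0]

def update(path, success):
    for p in pheromone:                      # evaporation on all trails
        pheromone[p] *= (1 - RHO)
    if success:
        for p in path:                       # reinforce the successful path
            pheromone[p] += Q

# One simulated 'ant' (query) per iteration: route, observe, update trails
for _ in range(20):
    peer = choose_neighbor()
    hit = semantic_match[peer] > random.random()   # toy success model
    update([peer], hit)

print(sorted(pheromone.items(), key=lambda kv: -kv[1]))
```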

Relevance:

80.00%

Publisher:

Abstract:

Pattern recognition in large amounts of data has been paramount in the last decade, since it is not straightforward to design interactive, real-time classification systems. Very recently, the Optimum-Path Forest (OPF) classifier was proposed to overcome such limitations, together with its training-set pruning algorithm, which requires a parameter that, to date, has been set empirically. In this paper, we propose a Harmony Search-based algorithm that can find near-optimal values for this parameter. The experimental results show that our algorithm is able to find proper values for the OPF pruning-algorithm parameter. © 2011 IEEE.
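
A generic Harmony Search sketch for tuning a single real-valued parameter is given below; the objective is a made-up surrogate for pruning quality, and the HMCR/PAR/bandwidth values are illustrative, not the paper's.

```python
# Sketch: generic Harmony Search over one real parameter, standing in for
# the OPF pruning-parameter tuning. The objective is a made-up surrogate.
import random

def objective(x):
    # Hypothetical 'accuracy after pruning' curve peaking at x = 0.35
    return -(x - 0.35) ** 2

LOW, HIGH = 0.0, 1.0
HMS, HMCR, PAR, BW = 10, 0.9, 0.3, 0.05   # memory size, rates, bandwidth

memory = [random.uniform(LOW, HIGH) for _ in range(HMS)]
for _ in range(500):
    if random.random() < HMCR:                 # draw from harmony memory
        x = random.choice(memory)
        if random.random() < PAR:              # pitch adjustment
            x = min(max(x + random.uniform(-BW, BW), LOW), HIGH)
    else:                                      # random improvisation
        x = random.uniform(LOW, HIGH)
    worst = min(memory, key=objective)
    if objective(x) > objective(worst):        # replace the worst harmony
        memory[memory.index(worst)] = x

print("near-optimal parameter:", max(memory, key=objective))
```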

Relevance:

80.00%

Publisher:

Abstract:

Graduate Program in Electrical Engineering - FEIS

Relevance:

80.00%

Publisher:

Abstract:

This study proposes a constrained least squares (CLS) pre-distortion scheme for multiple-input single-output (MISO) multiple-access ultra-wideband (UWB) systems. In this scheme, a simple objective function is defined that can be efficiently solved by a gradient-based algorithm. For the performance evaluation, scenarios CM1 and CM3 of the IEEE 802.15.3a channel model are considered. Results show that the CLS algorithm converges quickly and offers a good trade-off between the reduction of intersymbol interference (ISI) and multiple-access interference (MAI) and the preservation of the signal-to-noise ratio (SNR), performing better than time-reversal (TR) pre-distortion.
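
A toy sketch of the gradient-descent flavor of such pre-distortion follows (the paper's exact CLS objective and constraints are not reproduced): a pre-filter is descended on the squared error between the pre-filtered channel response and an ideal single tap, with a transmit-power normalization after each step. The channel taps and filter lengths are made up.

```python
# Toy sketch of gradient-descent pre-distortion: shape a pre-filter w so
# that (channel * w) approaches a single clean tap, normalizing transmit
# power each step. The channel is made up; the paper's CLS objective and
# constraints are richer than this.
import numpy as np

rng = np.random.default_rng(2)
h = rng.normal(size=8) * np.exp(-0.4 * np.arange(8))  # toy decaying channel
L = 16                                                # pre-filter length

# Convolution matrix H such that H @ w == np.convolve(h, w)
H = np.zeros((len(h) + L - 1, L))
for i in range(L):
    H[i:i + len(h), i] = h

d = np.zeros(len(h) + L - 1)
d[(len(d) - 1) // 2] = 1.0            # desired: one clean tap, no ISI

w = np.zeros(L)
w[0] = 1.0
mu = 0.05                             # step size
for _ in range(2000):
    err = H @ w - d
    w -= mu * (H.T @ err)             # gradient of ||Hw - d||^2 (up to 2x)
    w /= np.linalg.norm(w)            # transmit-power constraint

resp = np.abs(H @ w)
print("peak-to-ISI ratio:", resp.max() / (resp.sum() - resp.max()))
```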