847 results for "Logic-based optimization algorithm"


Relevance: 100.00%

Abstract:

A pioneer team of students at the University of Girona decided to design and develop an autonomous underwater vehicle (AUV) called ICTINEU-AUV to face the Student Autonomous Underwater Challenge-Europe (SAUC-E). The prototype evolved from the initial computer-aided design (CAD) model to an operative AUV in the short period of seven months. Open-frame and modular design principles, together with compatibility with other robots previously developed at the lab, provided the main design philosophy. At the robot's core, two networked computers give access to a wide set of sensors and actuators. The Gentoo/Linux distribution was chosen as the onboard operating system. A software architecture based on a set of distributed objects with soft real-time capabilities was developed, and a hybrid control architecture including mission control, a behavioural layer and a robust map-based localization algorithm made ICTINEU-AUV the winning entry.

Relevance: 100.00%

Abstract:

In this paper we describe a system for underwater navigation with AUVs in partially structured environments, such as dams, ports or marine platforms. An imaging sonar is used to obtain information about the location of planar structures present in such environments. This information is incorporated into a feature-based SLAM algorithm in a two-step process: (1) the full 360° sonar scan is undistorted (to compensate for vehicle motion), thresholded and segmented to determine which measurements correspond to planar environment features and which should be ignored; and (2) SLAM proceeds once the data association is obtained: both the vehicle motion and the measurements whose correct association has been determined are incorporated into the SLAM algorithm. This two-step delayed SLAM process allows the feature and vehicle locations to be determined robustly in the presence of large numbers of spurious or unrelated measurements that might correspond to boats, rocks, etc. Preliminary experiments show the viability of the proposed approach.

Relevance: 100.00%

Abstract:

This paper describes a navigation system for autonomous underwater vehicles (AUVs) in partially structured environments, such as dams, harbors, marinas or marine platforms. A mechanically scanned imaging sonar is used to obtain information about the location of planar structures present in such environments. A modified version of the Hough transform has been developed to extract line features, together with their uncertainty, from the continuous sonar data flow. The information obtained is incorporated into a feature-based SLAM algorithm running an Extended Kalman Filter (EKF). Simultaneously, the AUV's position estimate is provided to the feature extraction algorithm to correct the distortions that the vehicle motion produces in the acoustic images. Experiments carried out in a marina on the Costa Brava (Spain) with the Ictineu AUV show the viability of the proposed approach.
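As background to the line-extraction step, a Hough transform votes each measurement point into a (θ, ρ) accumulator using the normal form ρ = x·cos θ + y·sin θ; the accumulator peak is the best-supported line. The sketch below is a minimal, generic version of that voting scheme (hypothetical function names, not the paper's modified transform, which also propagates uncertainty):

```python
import math
from collections import Counter

def hough_accumulator(points, theta_steps=180, rho_res=1.0):
    # Each point votes for every discretized line (theta, rho) that
    # passes through it: rho = x*cos(theta) + y*sin(theta).
    acc = Counter()
    for x, y in points:
        for t in range(theta_steps):
            theta = math.pi * t / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(t, round(rho / rho_res))] += 1
    return acc

def strongest_line(points, theta_steps=180, rho_res=1.0):
    # The accumulator peak gives the dominant line and its support.
    acc = hough_accumulator(points, theta_steps, rho_res)
    (t, r), votes = acc.most_common(1)[0]
    return math.pi * t / theta_steps, r * rho_res, votes
```

For sonar data the points would be the thresholded high-intensity returns of each scan sector; here any 2D point cloud works.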

Relevance: 100.00%

Abstract:

Molecular Quantum Similarity Measures (MQSM) require maximizing the overlap of the electron densities of the molecules being compared. This work presents a maximization algorithm for MQSM that is global in the limit where the electron densities are deformed into Dirac delta functions. From this algorithm, the equivalent algorithm for undeformed densities is derived.
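As background (stated here in standard notation, not the thesis's own formulas), the overlap-type similarity measure being maximized is the Carbó-style integral

```latex
Z_{AB} = \int \rho_A(\mathbf{r})\,\rho_B(\mathbf{r})\,\mathrm{d}\mathbf{r},
```

maximized over the relative placement (translation and rotation) of the two molecules. In the delta-function limit each density collapses to a weighted sum of the form $\rho(\mathbf{r}) \approx \sum_i q_i\,\delta(\mathbf{r}-\mathbf{R}_i)$, so $Z_{AB}$ reduces to a discrete sum over coinciding centres, which is what makes a global treatment of the maximization tractable.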

Relevance: 100.00%

Abstract:

This thesis proposes a solution to the problem of estimating the motion of an unmanned underwater vehicle (UUV). Our approach is based on the integration of the incremental measurements provided by a vision system. When the vehicle is close to the underwater terrain, it constructs a visual map (a so-called "mosaic") of the area where the mission takes place while, at the same time, localizing itself on this map, following the concurrent mapping and localization strategy. The proposed methodology is based on a feature-based mosaicking algorithm. A down-looking camera is attached to the underwater vehicle. As the vehicle moves, a sequence of images of the sea floor is acquired by the camera. For every image of the sequence, a set of characteristic features is detected by means of a corner detector. Then, their correspondences are found in the next image of the sequence. Solving the correspondence problem in an accurate and reliable way is a difficult task in computer vision. We consider different alternatives to solve this problem by introducing a detailed analysis of the textural characteristics of the image. This is done in two phases: first comparing different texture operators individually, and then selecting those that best characterize the point/matching pair and using them together to obtain a more robust characterization. Various alternatives are also studied to merge the information provided by the individual texture operators. Finally, the best approach in terms of robustness and efficiency is proposed. After the correspondences have been solved, for every pair of consecutive images we obtain a list of image features in the first image and their matchings in the next frame. Our aim is then to recover the apparent motion of the camera from these features. Although accurate texture analysis is devoted to the matching procedure, some false matches (known as outliers) may still appear among the right correspondences.
For this reason, a robust estimation technique is used to estimate the planar transformation (homography) which explains the dominant motion of the image. Next, this homography is used to warp the processed image to the common mosaic frame, constructing a composite image formed from every frame of the sequence. With the aim of estimating the position of the vehicle as the mosaic is being constructed, the 3D motion of the vehicle can be computed from the measurements obtained by a sonar altimeter and the incremental motion computed from the homography. Unfortunately, as the mosaic increases in size, local image alignment errors increase the inaccuracies associated with the position of the vehicle. Occasionally, the trajectory described by the vehicle may cross over itself. In this situation new information is available, and the system can readjust the position estimates. Our proposal consists not only of localizing the vehicle, but also of readjusting the trajectory described by the vehicle when crossover information is obtained. This is achieved by implementing an augmented state Kalman filter (ASKF). Kalman filtering is an adequate framework to deal with position estimates and their associated covariances. Finally, some experimental results are shown. A laboratory setup has been used to analyze and evaluate the accuracy of the mosaicking system. This setup enables a quantitative measurement of the accumulated error of the mosaics created in the lab. Then, results obtained from real sea trials using the URIS underwater vehicle are shown.

Relevance: 100.00%

Abstract:

Different optimization methods can be employed to optimize a numerical estimate for the match between an instantiated object model and an image. In order to take advantage of gradient-based optimization methods, perspective inversion must be used in this context. We show that convergence can be very fast by extrapolating to maximum goodness-of-fit with Newton's method. This approach is related to methods which either maximize a similar goodness-of-fit measure without use of gradient information, or else minimize distances between projected model lines and image features. Newton's method combines the accuracy of the former approach with the speed of convergence of the latter.
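The extrapolation idea can be illustrated in one dimension: Newton's method applied to the stationarity condition f′(x) = 0 of a goodness-of-fit function converges very fast near a maximum (where f″ < 0). The sketch below uses a hypothetical Gaussian-shaped fit measure, not the paper's actual model-matching objective:

```python
import math

def newton_maximize(dfdx, d2fdx2, x0, iters=50, tol=1e-12):
    """Newton iteration on f'(x) = 0: x <- x - f'(x)/f''(x).

    Near a maximum (f'' < 0) convergence is quadratic, which is the
    speed advantage over derivative-free search the abstract refers to."""
    x = x0
    for _ in range(iters):
        step = dfdx(x) / d2fdx2(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# A Gaussian-shaped goodness-of-fit surrogate with its maximum at x = 2.
def gof(x):
    return math.exp(-(x - 2.0) ** 2)

def dgof(x):
    return -2.0 * (x - 2.0) * gof(x)

def d2gof(x):
    return (4.0 * (x - 2.0) ** 2 - 2.0) * gof(x)
```

For a quadratic goodness-of-fit surface the same iteration reaches the maximum in a single step, which is why extrapolating to maximum fit is so effective once the model is close.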

Relevance: 100.00%

Abstract:

This paper discusses ECG signal classification after parametrizing the ECG waveforms in the wavelet domain. Signal decomposition using perfect-reconstruction quadrature mirror filter banks can provide a very parsimonious representation of ECG signals. In the current work, the filter parameters are adjusted by a numerical optimization algorithm in order to minimize a cost function associated with the filter cut-off sharpness. The goal is to achieve a better compromise between frequency selectivity and time resolution at each decomposition level than standard orthogonal filter banks such as those of the Daubechies and Coiflet families. Our aim is to optimally decompose the signals in the wavelet domain so that they can subsequently be used as inputs for training a neural network classifier.
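The decomposition structure referred to is a two-channel filter bank applied recursively to the approximation band. The sketch below uses the Haar pair, the simplest orthogonal quadrature-mirror filters, purely as a stand-in for the paper's optimized filters; it shows the analysis/synthesis split, the downsampling, and the dyadic tree:

```python
def analysis_haar(x):
    # One analysis stage of a two-channel filter bank with the Haar
    # quadrature-mirror pair (lowpass/highpass), with downsampling by 2.
    s = 2 ** -0.5
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

def synthesis_haar(approx, detail):
    # Matching synthesis stage; with Haar filters reconstruction is perfect.
    s = 2 ** -0.5
    x = []
    for a, d in zip(approx, detail):
        x.append(s * (a + d))
        x.append(s * (a - d))
    return x

def wavelet_decompose(x, levels):
    # Dyadic wavelet tree: keep splitting the approximation band.
    coeffs = []
    for _ in range(levels):
        x, d = analysis_haar(x)
        coeffs.append(d)
    coeffs.append(x)  # final approximation band
    return coeffs
```

In the paper's setting the filter taps themselves become the optimization variables, subject to the perfect-reconstruction constraint; the resulting coefficient vectors would feed the neural network classifier.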

Relevance: 100.00%

Abstract:

In this work, we prove a weak Noether-type Theorem for a class of variational problems that admit broken extremals. We use this result to prove discrete Noether-type conservation laws for a conforming finite element discretisation of a model elliptic problem. In addition, we study how well the finite element scheme satisfies the continuous conservation laws arising from the application of Noether’s first theorem (1918). We summarise extensive numerical tests, illustrating the conservation of the discrete Noether law using the p-Laplacian as an example and derive a geometric-based adaptive algorithm where an appropriate Noether quantity is the goal functional.

Relevance: 100.00%

Abstract:

In this paper we describe a new protocol that we call the Curry-Howard protocol between a theory and the programs extracted from it. This protocol leads to the expansion of the theory and the production of more powerful programs. The methodology we use for automatically extracting “correct” programs from proofs is a development of the well-known Curry-Howard process. Program extraction has been developed by many authors, but our presentation is ultimately aimed at a practical, usable system and has a number of novel features. These include 1. a very simple and natural mimicking of ordinary mathematical practice and likewise the use of established computer programs when we obtain programs from formal proofs, and 2. a conceptual distinction between programs on the one hand, and proofs of theorems that yield programs on the other. An implementation of our methodology is the Fred system. As an example of our protocol we describe a constructive proof of the well-known theorem that every graph of even parity can be decomposed into a list of disjoint cycles. Given such a graph as input, the extracted program produces a list of the (non-trivial) disjoint cycles as promised.
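The task performed by the extracted program can be illustrated directly. The sketch below is ordinary Python written by hand (not an extraction from a formal proof, and not the Fred system's output): it decomposes a graph whose vertices all have even degree into edge-disjoint cycles by walking and deleting edges, which is the constructive content of the theorem:

```python
def cycle_decomposition(edges):
    """Decompose a simple graph whose vertices all have even degree
    into edge-disjoint cycles, by walking and deleting edges."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    if any(len(ns) % 2 for ns in adj.values()):
        raise ValueError("every vertex must have even degree")
    cycles = []
    for start in list(adj):
        while adj[start]:
            # In an even graph, a walk that deletes edges as it goes can
            # only get stuck back at its starting vertex: every other
            # vertex it enters still has an unused edge to leave by.
            cycle = [start]
            cur = start
            while True:
                nxt = adj[cur].pop()
                adj[nxt].discard(cur)
                cycle.append(nxt)
                cur = nxt
                if cur == start:
                    break
            cycles.append(cycle)
    return cycles
```

The parity argument in the comment is exactly the step a constructive proof must make explicit, which is what the Curry-Howard extraction turns into program logic.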

Relevance: 100.00%

Abstract:

Guides for mineral exploration are usually based on conceptual deposit models, typically drawing on geologists' experience, descriptive data and genetic data. Numerical modelling, both probabilistic and non-probabilistic, for estimating the occurrence of mineral deposits is a newer procedure whose use and acceptance by the geological community grows every day. This thesis applies recent methodologies for generating mineral favorability maps. The so-called Rivera Crystalline Island, an erosional window of the Paraná Basin in northern Uruguay, was chosen as the case study for applying the methodologies. The mineral favorability maps were built from the following types of data, information and prospecting results: (1) orbital imagery; (2) geochemical prospecting; (3) airborne geophysical prospecting; (4) geological-structural mapping; and (5) altimetry. These inputs were selected and processed on the basis of a conceptual mineral deposit model developed from the San Gregorio Gold Mine. The conceptual model (the San Gregorio model) includes the descriptive and genetic characteristics of the San Gregorio Mine, which covers the significant characteristic elements of the other known mineral occurrences on the Rivera Crystalline Island. Generating the favorability maps involved building a database, processing the data, and integrating the data. The construction and processing stages comprised collecting, selecting and treating the data so as to constitute the so-called Information Layers, which were generated and processed in organized groupings to form the Integration Factors for favorability mapping on the Rivera Crystalline Island.
The data were integrated using two different methodologies: (1) Weights of Evidence (data-driven) and (2) Fuzzy Logic (knowledge-driven). The favorability maps resulting from the two integration methodologies were first analysed and interpreted individually, and then compared. Both methodologies succeeded in identifying the known mineralized areas, as well as other areas not yet worked, as highly favorable. The maps from the two methodologies coincided with respect to the most favorable areas. The Weights of Evidence methodology produced a favorability map that was more conservative in areal extent, but more optimistic in favorability values, than the maps produced with the Fuzzy Logic methodology. New targets for mineral exploration were identified and should be investigated in detail.
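Of the two integration methods, Weights of Evidence has a compact closed form: for a binary evidence layer B and deposits D, the positive and negative weights are W⁺ = ln[P(B|D)/P(B|D̄)] and W⁻ = ln[P(B̄|D)/P(B̄|D̄)], and the contrast C = W⁺ − W⁻ measures the layer's predictive strength. A minimal sketch from unit-cell counts (hypothetical names and numbers, not the thesis's data):

```python
import math

def weights_of_evidence(n_deposit_and_evidence, n_deposit, n_evidence, n_total):
    """Positive weight W+, negative weight W- and contrast C = W+ - W-
    for one binary evidence layer, from unit-cell counts."""
    p_b_given_d = n_deposit_and_evidence / n_deposit
    p_b_given_nd = (n_evidence - n_deposit_and_evidence) / (n_total - n_deposit)
    w_plus = math.log(p_b_given_d / p_b_given_nd)
    w_minus = math.log((1.0 - p_b_given_d) / (1.0 - p_b_given_nd))
    return w_plus, w_minus, w_plus - w_minus
```

Summing the applicable weight of each layer onto the prior log-odds (under the method's conditional-independence assumption) yields the posterior favorability for each cell; the Fuzzy Logic alternative instead combines expert-assigned membership values with fuzzy operators.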

Relevance: 100.00%

Abstract:

This work deals with strategy and knowledge, asking why organizations fail to take advantage of knowledge assets that go beyond those defined by the organizational strategy and that could generate additional value for their significant stakeholders. These knowledge assets beyond those required by jobs and roles, here called "cognitive surpluses", could induce strategic flexibility, create distinctive intangible resources and generate additional value. However, the logic of traditional strategy, based on efficiency and rationality, defines the knowledge demanded and, as a rule, creates a strategic rigidity that limits the room for contributions from cognitive surpluses that may not be directly related to the objectives and goals established in the strategic plan. Another logic, based on cultures of participation, abundance and generosity, is brought into consideration, under the following boundary conditions: (i) cognitive surpluses can be identified as rare, valuable, non-substitutable, hard to imitate and the result of long development processes, and can therefore distinguish the organization from others in its sector; (ii) cognitive surpluses can induce new operating dynamics in enabling contexts; and (iii) enabling contexts can act as favourable settings for revealing, mobilizing and using these surpluses. The research relied predominantly on qualitative methods, procedures and techniques. Specialists working as researchers and consultants, with recognized standing in the fields of Strategy, Knowledge Management and People Management, were interviewed.
The results showed how organizations deal with cognitive surpluses; what organizations could do with them; the barriers to revealing, mobilizing and using these surpluses; what governs the creation of today's enabling contexts; whether or not interaction spaces are intentionally built by organizations; the attitudes of organization members towards cognitive surpluses; and the perceived benefits that cognitive surpluses can bring to the organization's significant stakeholders.

Relevance: 100.00%

Abstract:

The internet is a border-crossing, potential-amplifying environment for information, since it makes possible the sharing and distribution of, and interaction with, the content available in it. However, this information system can produce an opposing movement when it generates an avalanche of superficial information that hinders critical absorption by the user. Liquid-modern society, characterized by short-lived fashions, creates a fluid subject whose habits never become concrete: they are so temporary that they do not take shape. Information reproduces the same scenario, since the user, inserted into a logic based on supplying information, is conditioned to consume rather than absorb it or transform it into knowledge, as the flow of content production does not allow it. It is in this context that the publishing of cultural expressions comes into question, since it follows the trend of a liquid society. The discussion takes up diverse cultural expressions in Sergipe, such as cinema, theatre, craftsmanship, events and memory spaces (museums, art galleries, memorials, archives, libraries, history institutes, science academies), among others, and analyses the content production of the Infonet Portal, based on the reports published during April, May and June of 2008, a period considered one of cultural turbulence in the state due to the June Festivals (Saint John, Saint Joseph and Saint Peter). To that end, the ideas of Zygmunt Bauman, Pierre Lévy, Edgar Morin and Dominique Wolton are discussed. In addition, the characteristics of the internet and their applicability within the portal are analysed in order to understand how information is produced in cyberculture, a movement that allows memorization, potentialization and interaction, among other criteria inherent to cyberspace.

Relevance: 100.00%

Abstract:

The Nossa Senhora da Conceição Seminary, installed in 1894 by Dom Adauto Aurélio de Miranda Henriques, first bishop of Paraíba, and the Episcopal Seminary of the Sagrado Coração de Jesus, established in 1913 by Dom José Thomas Gomes da Silva, first bishop of the Aracaju diocese, were created as a result of the disestablishment of religion brought about by the proclamation of the Brazilian Republic in 1889, with the aim of enlarging the number of priests and changing the image of the married, undisciplined priest that had identified the Catholic Church in colonial and imperial Brazil. These bishops became intellectuals within the government, the dioceses and the houses of priestly formation. This doctoral thesis takes as its object of study the academic and priestly formation developed in these seminaries from 1894 to 1933: 1894 is the year of the creation of the João Pessoa Seminary, which comprised the Minor course (preparatory) and the Major course (made up of philosophical and theological studies), while 1933, the closing year of the research, marks the end of the Major Seminary of Sergipe, which was created in 1913 and offered the Minor and Major courses until 1933. The objective of this investigation is to show the teaching models that guided and directed priestly formation in these seminaries, and the results of their application. To understand the teaching models of the seminaries studied, my line of research is the Catholic Church and priestly formation in Brazil. Given the object and the objective, I chose the historical-comparative method, Araujo de Barros's (2004) notions of scholarly models and Sirinelli's (1996) notion of intellectuals. These references allowed me to analyse the formation given in the seminaries and the participation and actions of the seminarians, including after their formal schooling.
The thesis defended is that the teaching model developed in the Brazilian seminaries created after the disestablishment of religion followed a single centralized model (seminary formation and aims pre-arranged by the Holy See), although adapted to the local reality, with a formation structure that privileged not only the spiritual and moral but also the intellectual; this model was responsible for generations of intellectuals (teacher priests, educator priests, journalist priests and so on) who boosted education in Brazil during the first three decades of the Republic, when, in theory, the state had separated itself from religion, that is, the government no longer subsidized the Church and, among other things, received no help from the Church for public teaching in the country. The investigation revealed the adherence of the bishops and their followers, whether to the precepts of the Council of Trent or to other ideas, in directing priestly formation in the seminaries. By creating and installing diocesan seminaries, Bishops Dom Adauto and Dom José went beyond their formal functions, building a teaching model conceived from a core pedagogical logic based on numerous religious, moral and ethical exercises, alongside the various branches of knowledge connected to the humanities, philosophy and theology. Clearly following the principles of rationality (a way of teaching in which each subject has its own teacher and each class gathers students of the same level of knowledge, regardless of age) and efficiency (trying to teach the whole content in each class), the seminaries studied provided a complete education, structuring a spiritual, moral and intellectual formation of a quality demonstrated by the priests at the different levels at which they acted.
Their formation, their actions and their fulfilment of the priestly ministry allowed a broad accomplishment, in the sense that priestly activity became associated with cultural, educational and welfare work; in short, with intellectuals.

Relevance: 100.00%

Abstract:

The objective in the facility location problem with limited distances is to minimize the sum of distance functions from the facility to the customers, but with a limit on each distance, after which the corresponding function becomes constant. The problem has applications in situations where the service provided by the facility is insensitive to distance beyond a given threshold (e.g., fire station location). In this work, we propose a global optimization algorithm for the case in which there are lower and upper limits on the number of customers that can be served.
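The limited-distance objective replaces each customer's distance d_i with min(d_i, λ_i), where λ_i is that customer's threshold. The sketch below (hypothetical names; a brute-force grid search standing in for the abstract's global algorithm, and ignoring its lower/upper limits on the number of customers served) evaluates and minimizes this objective for a planar facility:

```python
def limited_distance_cost(x, y, customers, limits):
    # Each customer contributes its Euclidean distance to the facility,
    # capped at that customer's threshold distance lambda_i.
    total = 0.0
    for (cx, cy), lam in zip(customers, limits):
        d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
        total += min(d, lam)
    return total

def grid_search(customers, limits, lo, hi, steps=100):
    # Exhaustive evaluation over a square region [lo, hi]^2; only a
    # stand-in for a proper global optimization method, since the capped
    # objective is non-convex and local search can stall on plateaus.
    best = None
    for i in range(steps + 1):
        for j in range(steps + 1):
            x = lo + (hi - lo) * i / steps
            y = lo + (hi - lo) * j / steps
            c = limited_distance_cost(x, y, customers, limits)
            if best is None or c < best[0]:
                best = (c, x, y)
    return best
```

The capping is what makes the problem interesting: a distant customer whose cap is already reached exerts no pull on the facility, so the optimum can ignore far-away demand entirely.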