171 results for "algorismes"
Abstract:
This project consists in designing the control algorithm for an unmanned autogyro. Its main application is carrying out tasks that are routine or dangerous for a pilot, such as fire extinguishing, chemical-risk assessment, or surveillance of restricted-access sites. A study of the vehicle's motion is carried out to obtain its dynamic model. From the equations describing its motion, a numerical simulation of the vehicle is performed. The designed controller is then incorporated and its performance evaluated. Finally, the system is implemented on a microcontroller.
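The workflow described above (model the dynamics, simulate, close the loop, evaluate) can be sketched with a generic discrete PID controller acting on a toy first-order plant. This is purely illustrative: the abstract does not specify the controller type, and neither the plant nor the gains below come from the thesis.

```python
# Illustrative sketch only: a discrete PID loop on a hypothetical
# first-order plant (x' = -x + u), NOT the autogyro model or controller.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def simulate(steps=1000, dt=0.01):
    # Drive the toy plant from x = 0 toward the setpoint 1.0.
    pid = PID(kp=4.0, ki=1.0, kd=0.1, dt=dt)
    x = 0.0
    for _ in range(steps):
        u = pid.step(1.0, x)
        x += (-x + u) * dt   # forward-Euler integration of the plant
    return x

x_final = simulate()
```

The same predict-control-integrate loop structure carries over when the toy plant is replaced by the full vehicle model obtained from the equations of motion.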
Abstract:
Hybrid navigation systems integrate position and velocity measurements from satellites (GPS) and from inertial measurement units (IMU). The data from these sensors must be fused and smoothed, and several filtering algorithms exist for this purpose, treating the data jointly or separately. In this work the Kalman and IMM filter algorithms were coded in Matlab and their performance was compared over several vehicle trajectories. The errors of the two filters were evaluated quantitatively, and their parameters were tuned to minimise these errors. With the filters properly tuned, the IMM filter proved superior to the Kalman filter for both abrupt and smooth manoeuvres, although its complexity and required computation time are higher.
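A minimal scalar Kalman filter illustrates the fuse-and-smooth step described above. This toy sketch (a nearly constant signal, with hypothetical noise variances q and r) is far simpler than the GPS/IMU filters compared in the thesis:

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.04):
    # Scalar Kalman filter for a (nearly) constant signal observed in noise:
    #   state:       x_k = x_{k-1} + w_k,  w ~ N(0, q)
    #   measurement: z_k = x_k + v_k,      v ~ N(0, r)
    x, p = 0.0, 1.0           # initial state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                # predict: variance grows by process noise
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)      # update with the innovation z - x
        p *= (1 - k)          # posterior variance shrinks
        estimates.append(x)
    return estimates

# Hypothetical data: a constant true value observed through Gaussian noise.
random.seed(0)
true_value = 1.0
zs = [true_value + random.gauss(0, 0.2) for _ in range(200)]
est = kalman_1d(zs)
```

An IMM filter, as compared in the work, runs a bank of such filters under different motion models and mixes their estimates by model likelihood, which is what makes it better at abrupt manoeuvres.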
Abstract:
The main motivation of this work has been to implement the Rijndael-AES algorithm in a SageMath worksheet, a freely distributed mathematical software package under active development, taking advantage of its built-in tools and functionality.
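One mathematical building block of Rijndael-AES is arithmetic in the finite field GF(2^8). As an illustration (in plain Python rather than the Sage worksheet the work describes), multiplication modulo the AES polynomial can be sketched as:

```python
def xtime(a):
    # Multiply by x in GF(2^8) modulo the AES polynomial x^8+x^4+x^3+x+1.
    a <<= 1
    if a & 0x100:
        a ^= 0x11B
    return a & 0xFF

def gf_mul(a, b):
    # Russian-peasant multiplication in AES's field GF(2^8): add (XOR)
    # shifted copies of a for each set bit of b.
    result = 0
    while b:
        if b & 1:
            result ^= a
        a = xtime(a)
        b >>= 1
    return result
```

The worked example {57} * {83} = {c1} from the AES specification (FIPS-197) is a convenient sanity check for any such implementation.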
Abstract:
Implementation of the design of a peer-to-peer distributed computing application. A model of the system is built, the algorithms and policies that make it up are implemented, and the system is simulated.
Abstract:
Neuronal dynamics are fundamentally constrained by the underlying structural network architecture, yet many details of this synaptic connectivity remain unknown, even in neuronal cultures in vitro. Here we extend a previous approach based on information theory, the Generalized Transfer Entropy, to the reconstruction of the connectivity of simulated neuronal networks of both excitatory and inhibitory neurons. We show that, due to the model-free nature of the developed measure, both kinds of connections can be reliably inferred if the average firing rate between synchronous burst events exceeds a small minimum frequency. Furthermore, we suggest, based on systematic simulations, that even lower spontaneous inter-burst rates could be raised to meet the requirements of our reconstruction algorithm by applying a weak, spatially homogeneous stimulation to the entire network. By combining multiple recordings of the same in silico network before and after pharmacologically blocking inhibitory synaptic transmission, we then show how it becomes possible to infer with high confidence the excitatory or inhibitory nature of each individual neuron.
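The core idea behind transfer entropy, how much one time series' past improves prediction of another's next state beyond that series' own past, can be sketched for discrete signals. This is a generic order-1 estimator for binary series, not the Generalized Transfer Entropy developed in the paper:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    # Order-1 transfer entropy TE(X -> Y), in bits, estimated from
    # empirical frequencies of the triples (y_{t+1}, y_t, x_t).
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    count_full = Counter(triples)                         # (y+, y, x)
    count_self = Counter((yn, yp) for yn, yp, _ in triples)   # (y+, y)
    count_cond = Counter((yp, xp) for _, yp, xp in triples)   # (y, x)
    count_past = Counter(yp for _, yp, _ in triples)          # (y)
    te = 0.0
    for (yn, yp, xp), c in count_full.items():
        p_joint = c / n
        p_full = c / count_cond[(yp, xp)]        # p(y+ | y, x)
        p_self = count_self[(yn, yp)] / count_past[yp]  # p(y+ | y)
        te += p_joint * log2(p_full / p_self)
    return te

# Toy data: y copies x with one step of delay, so information flows x -> y.
random.seed(1)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
```

On this toy pair the estimator recovers roughly one bit of information flow from x to y and almost none in the reverse direction, which is the asymmetry the reconstruction method exploits.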
Abstract:
In recent years, elliptic-curve cryptography has become increasingly important, to the point of now forming part of several industrial standards. Although elliptic-curve variants of classical cryptosystems such as RSA have been designed, their greatest interest lies in their application to cryptosystems based on the Discrete Logarithm Problem, such as those of ElGamal type. In this case, elliptic cryptosystems guarantee the same security as those built over the multiplicative group of a prime finite field, but with much shorter key lengths. We therefore present the good properties of these cryptosystems, as well as the basic requirements for a curve to be cryptographically useful, which are closely related to its cardinality. We review some methods for discarding curves that are not cryptographically useful, as well as others for obtaining good curves from a given one. Finally, we describe some applications, such as their use in smart cards and RFID systems, and conclude with some recent advances in this field.
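The group law that underpins ElGamal-type elliptic cryptosystems can be illustrated on a small textbook curve, y^2 = x^3 + 2x + 2 over F_17 (whose group has prime cardinality 19); real curves use primes of around 256 bits, so this sketch is for intuition only:

```python
# Toy curve y^2 = x^3 + 2x + 2 over F_17; P = (5, 1) generates all 19 points.
P_MOD, A = 17, 2
INF = None  # the point at infinity, the group identity

def ec_add(p, q):
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return INF                                   # p + (-p) = identity
    if p == q:                                       # tangent slope (doubling)
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:                                            # chord slope (addition)
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    x3 = (s * s - x1 - x2) % P_MOD
    y3 = (s * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def ec_mul(k, p):
    # Double-and-add scalar multiplication: the one-way operation on which
    # the elliptic-curve Discrete Logarithm Problem rests.
    result = INF
    while k:
        if k & 1:
            result = ec_add(result, p)
        p = ec_add(p, p)
        k >>= 1
    return result
```

The cardinality requirement mentioned above shows up directly here: because the group order 19 is prime, every point other than the identity is a generator, which is the property curve-selection methods try to (approximately) ensure at cryptographic sizes.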
Abstract:
Background: Parallel T-Coffee (PTC) was the first parallel implementation of the T-Coffee multiple sequence alignment tool. It is based on MPI and RMA mechanisms. Its purpose is to reduce the execution time of large-scale sequence alignments. It can be run on distributed-memory clusters, allowing users to align data sets consisting of hundreds of proteins within a reasonable time. However, most of the potential users of this tool are not familiar with the use of grids or supercomputers. Results: In this paper we show how PTC can be easily deployed and controlled on a supercomputer architecture through a web portal developed with Rapid. Rapid is a tool for efficiently generating standardized portlets for a wide range of applications, and the approach described here is generic enough to be applied to other applications or to deploy PTC on different HPC environments. Conclusions: The PTC portal allows users to upload large numbers of sequences to be aligned by the parallel version of T-Coffee that could not be aligned on a single machine due to memory and execution-time constraints, and it provides a user-friendly solution.
Abstract:
This article introduces a new interface for T-Coffee, a consistency-based multiple sequence alignment program. This interface provides easy and intuitive access to the most popular features of the package. These include the default T-Coffee mode for protein and nucleic acid sequences, the M-Coffee mode that allows combining the output of any other aligners, and the template-based modes of T-Coffee that deliver high-accuracy alignments using structural or homology-derived templates. The three available template modes are Expresso, for aligning proteins with known 3D structures; R-Coffee, for aligning RNA sequences with conserved secondary structures; and PSI-Coffee, for accurately aligning distantly related sequences using homology extension. The new server benefits from recent improvements to the T-Coffee algorithm, can align up to 150 sequences as long as 10,000 residues, and is available from both http://www.tcoffee.org and its main mirror http://tcoffee.crg.cat.
Abstract:
We present a new branch and bound algorithm for weighted Max-SAT, called Lazy, which incorporates original data structures and inference rules, as well as a better-quality lower bound. We provide experimental evidence that our solver is very competitive and outperforms some of the best-performing Max-SAT and weighted Max-SAT solvers on a wide range of instances.
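A generic branch and bound skeleton for weighted Max-SAT, minimising the total weight of falsified clauses, can be sketched as follows. This sketch deliberately omits the Lazy solver's original data structures, inference rules, and stronger lower bound; its bound is just the weight of clauses already falsified by the partial assignment:

```python
def weighted_maxsat(num_vars, clauses):
    # clauses: list of (weight, literals); literal v > 0 means "variable v
    # is true", v < 0 means "variable v is false".
    # Returns the minimum total weight of falsified clauses.
    best = [sum(w for w, _ in clauses)]   # trivial upper bound

    def falsified_weight(assign):
        # Weight of clauses whose literals are all assigned and all false.
        total = 0
        for w, lits in clauses:
            if all(abs(l) <= len(assign) and (l > 0) != assign[abs(l) - 1]
                   for l in lits):
                total += w
        return total

    def branch(assign):
        lb = falsified_weight(assign)     # naive lower bound: cost so far
        if lb >= best[0]:
            return                        # prune: cannot beat the incumbent
        if len(assign) == num_vars:
            best[0] = lb                  # complete assignment, lb is exact
            return
        branch(assign + [True])           # branch on the next variable
        branch(assign + [False])

    branch([])
    return best[0]
```

Better lower bounds, such as those obtained from inference rules, prune far more of the search tree; that is precisely where solvers like Lazy gain their performance.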
Abstract:
The goal of this work is to create a statistical model, based only on easily computable parameters of a CSP instance, that predicts the runtime behaviour of the solving algorithms and lets us choose the best algorithm for the problem at hand. Although it may seem that the obvious choice should be MAC, the experimental results obtained so far show that, for large numbers of variables, other algorithms perform much better, especially on hard problems at the phase transition.
Abstract:
The analysis of the shape of excitation-emission matrices (EEMs) is a relevant tool for exploring the origin, transport and fate of dissolved organic matter (DOM) in aquatic ecosystems. Within this context, the decomposition of EEMs is acquiring notable relevance. A simple mathematical algorithm that automatically deconvolves individual EEMs is described, creating new possibilities for comparing DOM fluorescence properties across EEMs that differ widely from each other. A mixture-model approach is adopted to decompose complex surfaces into sub-peaks. The Laplacian operator and the Nelder-Mead optimisation algorithm are implemented to identify and automatically locate potential peaks in the EEM landscape. The EEMs of a simple artificial mixture of fluorophores and of DOM samples collected in a Mediterranean river are used to describe the model application and to illustrate a strategy that optimises the search for the optimal output.
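The peak-seeding idea, using the discrete Laplacian to flag candidate maxima in a 2-D landscape such as an EEM, can be sketched as follows. The threshold and the test surface are illustrative assumptions, and the subsequent Nelder-Mead refinement of each sub-peak is omitted:

```python
import math

def laplacian_peaks(grid, threshold):
    # Flag interior grid points where the 5-point discrete Laplacian is
    # strongly negative, i.e. candidate local maxima of the surface.
    rows, cols = len(grid), len(grid[0])
    peaks = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            lap = (grid[i - 1][j] + grid[i + 1][j] + grid[i][j - 1]
                   + grid[i][j + 1] - 4 * grid[i][j])
            if lap < -threshold:
                peaks.append((i, j))
    return peaks

# Toy landscape: a single Gaussian bump centred at (5, 5) on an 11x11 grid.
grid = [[math.exp(-((i - 5) ** 2 + (j - 5) ** 2) / 2) for j in range(11)]
        for i in range(11)]
peaks = laplacian_peaks(grid, threshold=1.0)
```

In a full deconvolution, each flagged location would seed a sub-peak of the mixture model, whose parameters are then refined by Nelder-Mead optimisation.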
Abstract:
Langevin equations of Ginzburg-Landau form, with multiplicative noise, are proposed to study the effects of fluctuations in domain growth. These equations are derived from a coarse-graining methodology. A Cahn-Hilliard-Cook linear stability analysis predicts some effects in the transient regime. We also derive numerical algorithms for the computer simulation of these equations. The numerical results corroborate the analytical predictions of the linear analysis. We also present simulation results for spinodal decomposition at late times.
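The generic shape of such a Langevin equation can be written down as a hedged sketch; the paper's exact coarse-grained coefficients and noise amplitude are not reproduced here, only the standard non-conserved Ginzburg-Landau form with a field-dependent (multiplicative) noise coupling:

```latex
% Generic sketch of non-conserved Ginzburg--Landau (Langevin) dynamics
% with multiplicative noise; not the paper's exact coarse-grained form.
\begin{align}
  \frac{\partial \phi(\mathbf{x},t)}{\partial t}
    &= -\Gamma\,\frac{\delta F[\phi]}{\delta \phi}
       + g(\phi)\,\eta(\mathbf{x},t), \\
  F[\phi] &= \int \mathrm{d}\mathbf{x}
       \left[ \tfrac{1}{2}\lvert\nabla\phi\rvert^{2}
            + \tfrac{r}{2}\,\phi^{2} + \tfrac{u}{4}\,\phi^{4} \right], \\
  \langle \eta(\mathbf{x},t)\,\eta(\mathbf{x}',t') \rangle
    &= 2\epsilon\,\delta(\mathbf{x}-\mathbf{x}')\,\delta(t-t').
\end{align}
```

Linearising these equations about the quench state is what yields Cahn-Hilliard-Cook-type predictions for the transient regime, which the numerical simulations then test.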
Abstract:
In this paper we design and develop several filtering strategies for the analysis of data generated by a resonant-bar gravitational wave (GW) antenna, with the goal of assessing the presence (or absence) therein of long-duration monochromatic GW signals, as well as the amplitude and frequency of any such signals within the sensitivity band of the detector. Such signals are most likely generated by the fast rotation of slightly asymmetric spinning stars. We develop practical procedures, together with a study of their statistical properties, which provide useful information on the performance of each technique. The selection of candidate events is then established according to threshold-crossing probabilities, based on the Neyman-Pearson criterion. In particular, we show that our approach, based on phase estimation, achieves a better signal-to-noise ratio than pure spectral analysis, the most common approach.
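Fixing a false-alarm probability and deriving the decision threshold from the noise model, per the Neyman-Pearson criterion, can be sketched for the simplest case of a scalar statistic in Gaussian noise. The detector statistics in the paper are more involved; the numbers below are illustrative:

```python
from statistics import NormalDist

def np_threshold(false_alarm_prob, noise_sigma=1.0):
    # Neyman-Pearson style thresholding: choose the smallest threshold
    # whose probability of being crossed by noise alone equals the
    # prescribed false-alarm probability.
    return noise_sigma * NormalDist().inv_cdf(1.0 - false_alarm_prob)

def detection_prob(threshold, signal_amplitude, noise_sigma=1.0):
    # Probability that signal + noise exceeds the threshold.
    return 1.0 - NormalDist(signal_amplitude, noise_sigma).cdf(threshold)

t = np_threshold(0.05)   # threshold for a 5% false-alarm probability
```

Candidate events are then the threshold crossings; a better signal-to-noise ratio (as claimed for the phase-estimation approach) raises the detection probability at the same fixed false-alarm rate.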
Abstract:
Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of excitation-emission matrices (EEMs), has become an indispensable tool for studying DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, self-organising maps (SOMs) have been proposed as a tool to explore patterns in large EEM data sets. The SOM is a pattern-recognition method that clusters input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how a SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash-flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology is that outlier samples appear naturally integrated in the analysis. We conclude that a SOM coupled with a correlation-analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
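The SOM training loop, moving the best-matching unit and its map neighbours toward each input, can be sketched in miniature. The map size, learning-rate and neighbourhood schedules, and the toy two-cluster data below are all illustrative assumptions, not the configuration used in the study:

```python
import math
import random

def train_som(data, map_size=4, dim=2, epochs=30, lr0=0.5, sigma0=2.0):
    # Minimal 1-D self-organising map: each node holds a weight vector;
    # the best-matching unit (BMU) and its neighbours move toward the input.
    rng = random.Random(42)
    nodes = [[rng.random() for _ in range(dim)] for _ in range(map_size)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                  # decaying rate
        sigma = max(sigma0 * (1 - epoch / epochs), 0.5)  # shrinking radius
        for x in data:
            bmu = min(range(map_size),
                      key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                        for d in range(dim)))
            for i in range(map_size):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                for d in range(dim):
                    nodes[i][d] += lr * h * (x[d] - nodes[i][d])
    return nodes

def bmu_index(nodes, x):
    return min(range(len(nodes)),
               key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                 for d in range(len(x))))

# Toy data: two well-separated clusters; after training, they should map
# to different nodes of the SOM.
data = [(0.0, 0.0), (0.1, 0.1), (0.9, 0.9), (1.0, 1.0)] * 10
nodes = train_som(data)
```

In the study, each input vector is a whole (vectorised) EEM rather than a 2-D point, and the trained component planes are what the correlation analysis is run on.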
Abstract:
This paper describes an evaluation framework that allows a standardized and quantitative comparison of IVUS lumen and media segmentation algorithms. The framework was introduced at the MICCAI 2011 Computing and Visualization for (Intra)Vascular Imaging (CVII) workshop, comparing the results of the eight participating teams. We describe the available database, comprising multi-center, multi-vendor and multi-frequency IVUS datasets, their acquisition, the creation of the reference standard, and the evaluation measures. The approaches address segmentation of the lumen, the media, or both borders; semi- or fully-automatic operation; and 2-D vs. 3-D methodology. Three performance measures for quantitative analysis have been proposed. The results of the evaluation indicate that, when semi-automatic methods are used, the vessel lumen and media can be segmented with an accuracy comparable to manual annotation, and that encouraging results can also be obtained with fully-automatic segmentation. The analysis performed in this paper also highlights the challenges in IVUS segmentation that remain to be solved.