876 results for Soft real-time distributed systems


Relevance:

100.00%

Publisher:

Abstract:

Hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a very relevant issue, due to the severe consequences that flooding, or water in general, may cause in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damage can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with residual uncertainty about what will actually happen. This type of uncertainty is what this thesis discusses and analyses. In operational problems, the ultimate aim of a forecasting system is not to reproduce the river's behaviour; that is only a means of reducing the uncertainty about what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to define clearly what is meant by uncertainty, since the literature is often confused on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting from a key question: should the choice of the intervention strategy be based on evaluating the model prediction by its ability to represent reality, or on evaluating what will actually happen on the basis of the information given by the model forecast? Once this idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time available to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to the time required to implement the intervention strategy, and it is also necessary to assess the probability of the flooding time.
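As an illustration of the last requirement, the flooding probability within a lead-time horizon can be estimated empirically from an ensemble of forecasts. The sketch below is a generic example under assumed names (ensemble water levels, a flood threshold, hourly lead times); it is not the method developed in the thesis.

```python
# Illustrative sketch (not the thesis' actual method): estimating the probability of
# flooding within a lead-time horizon from an ensemble of water-level forecasts.
import numpy as np

def flooding_probability(ensemble_levels, threshold, horizon_steps):
    """ensemble_levels: array of shape (n_members, n_lead_times) of forecast levels.
    Returns P(flood within horizon) and, per member, the first exceedance step (-1 if none)."""
    window = ensemble_levels[:, :horizon_steps]
    exceeds = window >= threshold                                # member/time exceedance mask
    flooded = exceeds.any(axis=1)                                # did each member flood at all?
    p_flood = flooded.mean()                                     # fraction of members that flood
    first_step = np.where(flooded, exceeds.argmax(axis=1), -1)   # time of first exceedance
    return p_flood, first_step

# Hypothetical example: 100 ensemble members, 48 hourly lead times, threshold 4.5 m
rng = np.random.default_rng(0)
levels = 3.0 + np.cumsum(rng.normal(0.02, 0.15, size=(100, 48)), axis=1)
p, t_first = flooding_probability(levels, threshold=4.5, horizon_steps=24)
print(f"P(flood within 24 h) = {p:.2f}")
```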

Relevance:

100.00%

Publisher:

Abstract:

Microsatellites and nanosatellites, such as CubeSats, lack integrated attitude-control and orbital-manoeuvre systems. The aim of this thesis was to build a system compatible with a one-unit CubeSat, complete with magnetic and mechanical actuators and including all the sensors and electronics required for its operation, creating a device fully independent of the vehicle on which it is installed and able to operate both autonomously and on commands received from the ground. The thesis describes the numerical simulation campaigns carried out to validate the technological choices made, the development phases of the electronics and the mechanics, the tests on the prototypes built, and the operation of the final system. Such extreme integration of the components can lead to interference between one device and another, as in the case of the magnetorquers and the magnetometers. The effects of their interaction were therefore studied and evaluated, verifying their magnitude and the validity of the design. Since the components used are all low-cost and of terrestrial (commercial) derivation, a brief theoretical introduction to the effects of the space environment on electronics is given, followed by the description of a fault-tolerant system based on new construction approaches. This system was built and tested, thereby verifying the possibility of realising a reliable controller for the attitude-control system that can withstand the space environment. Finally, some possible advanced versions of the system were analysed and their main design aspects outlined, such as the integration of GPS and the implementation of attitude-determination functions exploiting the sensors already on board.
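One common mitigation of the magnetorquer/magnetometer interference mentioned above (not necessarily the one adopted in the thesis) is to duty-cycle the actuators and sample the magnetometer only while they are off. A minimal sketch, with all function names assumed, is:

```python
# Illustrative sketch (a common mitigation, not necessarily the thesis' design):
# interleave magnetorquer actuation and magnetometer sampling so that the torquers'
# own field does not corrupt the field measurement.
import time

SETTLE_S = 0.05            # assumed time for the torquer field to decay before sampling
ACTUATION_WINDOW_S = 0.8   # assumed fraction of each control period spent actuating

def control_cycle(read_magnetometer, set_torquer_dipole, compute_dipole):
    """One attitude-control period: measure B with the torquers off, then actuate."""
    set_torquer_dipole((0.0, 0.0, 0.0))      # switch the torquers off
    time.sleep(SETTLE_S)                     # let the residual field settle
    b_body = read_magnetometer()             # clean magnetic-field measurement (tesla)
    dipole = compute_dipole(b_body)          # e.g. a B-dot style control law
    set_torquer_dipole(dipole)               # actuate for the rest of the period
    time.sleep(ACTUATION_WINDOW_S)
    return b_body, dipole
```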

Relevance:

100.00%

Publisher:

Abstract:

The aim of this dissertation is the experimental characterisation and quantitative description of the hybridisation of complementary nucleic acid strands with surface-bound capture molecules for the development of integrated biosensors. In contrast to solution-based methods, microarray substrates allow many nucleic acid combinations to be investigated in parallel. As a biologically relevant evaluation system, the actin gene, universally expressed in eukaryotes, from different plant species was used. This test system makes it possible to characterise closely related plant species on the basis of small differences in the gene sequence (SNPs). Building on this well-studied model of a house-keeping gene, a comprehensive microarray system was realised, consisting of short and long oligonucleotides (with incorporated LNA molecules), cDNAs, as well as DNA and RNA targets. This allowed a test system optimised for online measurement with high signal intensities to be developed. Based on the results, the entire signal path from nucleic acid concentration to digital value was modelled. The insights into the kinetics and thermodynamics of hybridisation gained from the development and the experiments are summarised in three publications that form the backbone of this dissertation. The first publication describes the improvement in reproducibility and specificity of microarray results achieved by online measurement of kinetics and thermodynamics compared with endpoint measurements on standard microarrays. For the evaluation of the huge amounts of data, two algorithms were developed: a reaction-kinetic modelling of the isotherms and a description of the melting transition based on Fermi-Dirac statistics. These algorithms are described in the second publication. By realising identical sequences in the chemically different nucleic acids (DNA, RNA and LNA), it is possible to investigate defined differences in the conformation of the ribose ring and in the C5 methyl group of the pyrimidines. The competitive interaction of these different nucleic acids of identical sequence and its effects on kinetics and thermodynamics is the topic of the third publication. Beyond the molecular-biological and technological development in the sensing of hybridisation reactions of surface-bound nucleic acid molecules, the automated evaluation and modelling of the resulting data volumes, and the associated improved quantitative description of the kinetics and thermodynamics of these reactions, the results contribute to a better understanding of the physico-chemical structure of the most elementary biological molecule and of its still not fully understood specificity.
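The melting-transition algorithm of the second publication is only named here, not specified. As a rough illustration of the idea, a two-state melting curve can be fitted with a sigmoid of Fermi-Dirac form; the function, parameter names and data below are hypothetical.

```python
# Minimal sketch of a Fermi-Dirac-type description of a melting transition,
# in the spirit of (but not identical to) the algorithm referenced above.
import numpy as np
from scipy.optimize import curve_fit

def fermi_dirac_melt(T, s_bound, T_m, width):
    """Bound-duplex signal vs temperature: a two-state sigmoid of Fermi-Dirac form."""
    return s_bound / (1.0 + np.exp((T - T_m) / width))

# Hypothetical melting-curve data: signal vs temperature (degrees C)
T = np.linspace(25, 85, 61)
rng = np.random.default_rng(1)
signal = fermi_dirac_melt(T, 1.0, 58.0, 3.5) + rng.normal(0, 0.02, T.size)

p_opt, _ = curve_fit(fermi_dirac_melt, T, signal, p0=[1.0, 55.0, 5.0])
print(f"fitted melting temperature T_m = {p_opt[1]:.1f} degrees C")
```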

Relevance:

100.00%

Publisher:

Abstract:

This thesis explores system performance for reconfigurable distributed systems and provides an analytical model for determining the throughput of theoretical systems based on the OpenSPARC FPGA Board and the SIRC Communication Framework. The model was developed by studying a small set of variables that together determine a system's throughput. Its importance lies in assisting system designers in deciding whether or not to commit to designing a reconfigurable distributed system, based on the estimated performance and hardware costs. Because custom hardware design and distributed system design are both time-consuming and costly, it is important for designers to make decisions regarding system feasibility early in the development cycle. Based on experimental data, the model presented here shows a close fit, with less than 10% experimental error on average. The model is limited to a certain range of problems, but it can still be used within those limitations and also provides a foundation for further work on modeling reconfigurable distributed systems.
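The thesis' model itself is not reproduced in this abstract. Purely as an illustration of the kind of early feasibility estimate such a model enables, a minimal bottleneck-style throughput sketch (all names, numbers and the model form are assumptions, not the thesis' equations) might look like this:

```python
# Illustrative sketch only: a generic bottleneck-style throughput estimate for a
# host-plus-FPGA-worker system; not the thesis' analytical model.
def estimated_throughput(bytes_per_job, link_bandwidth_bps, fpga_jobs_per_second, n_boards):
    """Jobs per second, limited by the slower of communication and computation."""
    comm_rate = link_bandwidth_bps / (8.0 * bytes_per_job)   # jobs/s the link can deliver
    compute_rate = fpga_jobs_per_second * n_boards           # jobs/s the boards can process
    return min(comm_rate, compute_rate)

# Hypothetical numbers: 4 boards, 1 Gbit/s link, 64 kB per job, 2000 jobs/s per board
print(estimated_throughput(64 * 1024, 1e9, 2000.0, 4))       # communication-bound, ~1907 jobs/s
```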

Relevance:

100.00%

Publisher:

Abstract:

At any level of organization of the central nervous system, most processes, ranging from ion channels to neuronal networks, occur in a closed loop, where the input to the system depends on its output. In contrast, most in vitro preparations and experimental protocols operate autonomously and do not depend on the output of the studied system. Thanks to progress in digital signal processing and real-time computing, it is now possible to artificially close the loop and investigate biophysical processes and mechanisms with increased realism. In this contribution, we review some of the most relevant examples of a new trend in in vitro electrophysiology, ranging from the use of the dynamic clamp to multi-electrode distributed feedback stimulation. We are convinced that these represent the beginning of new frontiers for the in vitro investigation of the brain, promising to open the still-existing borders between theoretical and experimental approaches while taking advantage of cutting-edge technologies.
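As a reminder of the principle behind one of the reviewed techniques, the dynamic clamp computes the current to inject from the just-measured membrane potential and an artificial conductance, thereby closing the loop. A minimal conceptual sketch (in practice this loop runs on dedicated real-time hardware at tens of kHz, and the acquisition/output callbacks below are hypothetical) is:

```python
# Conceptual sketch of the dynamic-clamp principle: the injected current depends on
# the membrane potential measured in the same cycle.
def dynamic_clamp_step(v_membrane_mV, g_syn_nS, e_rev_mV=0.0):
    """Current (pA) to inject to mimic an artificial conductance g_syn at this instant."""
    return -g_syn_nS * (v_membrane_mV - e_rev_mV)   # I = -g * (V - E_rev)

# One iteration of the closed loop, with hypothetical acquisition/output callbacks:
# v = read_membrane_potential(); write_current(dynamic_clamp_step(v, g_syn_nS=2.0))
```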

Relevance:

100.00%

Publisher:

Abstract:

This thesis develops high-performance real-time signal processing modules for direction-of-arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for performing subspace decomposition and polynomial rooting, which are traditionally implemented with sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, the complexity of the signal processing algorithms grows, making it increasingly difficult to satisfy the real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer a considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps and are the bottleneck to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single-instruction multiple-data (SIMD) hardware, or application-specific integrated circuits (ASICs), which offer a large number of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable and easy to implement. First, this thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations needed to converge to the correct singular values, thus coming closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the various hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable operating frequency. The system was developed with the objective of achieving high throughput, and the various modern cores available in FPGAs that were used to maximize performance are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this rooting method. The technique is parallel and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks in a direction-of-arrival estimation system by providing simple, high-throughput, parallel algorithms.
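The thesis' parallel rooting technique is not given in the abstract; the sketch below only illustrates the ingredient it builds on, namely Newton's iteration run in parallel from unit-circle starting points, which is where the signal roots of a root-MUSIC polynomial are expected to lie. The polynomial, start count and iteration count are illustrative, not the thesis' choices.

```python
# Generic sketch of Newton's method on a complex polynomial, started in parallel from
# points on the unit circle; an illustration of the underlying iteration only.
import numpy as np

def newton_roots(coeffs, n_starts=64, n_iter=30):
    """coeffs: polynomial coefficients, highest degree first (numpy.polyval convention)."""
    dcoeffs = np.polyder(coeffs)
    z = np.exp(2j * np.pi * np.arange(n_starts) / n_starts)   # unit-circle starting points
    for _ in range(n_iter):                                   # fixed, data-independent iteration count
        z = z - np.polyval(coeffs, z) / np.polyval(dcoeffs, z)
    return z

# Example: z**2 - (1 + 2j)*z + 2j has exact roots 1 and 2j
roots = newton_roots([1, -(1 + 2j), 2j])
print(np.unique(np.round(roots, 6)))
```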

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the problem of service development based on GSM handset signaling. The aim is to achieve this without the participation of the users, which requires the use of a passive GSM receiver on the uplink. Since no tool for capturing the GSM uplink was available, we developed a new method that can synchronize to multiple mobile devices by simply overhearing the traffic between them and the network. Our work includes the implementation of modules for signal recovery, message reconstruction and parsing. The method has been validated against a benchmark solution on the GSM downlink and independently evaluated on uplink channels. Initial evaluations show a success rate of up to 99% in message decoding, which is a very promising result. Moreover, we conducted measurements that reveal insights into the impact of signal power on capturing performance and investigate possible reactive measures.
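As a purely hypothetical illustration of the module decomposition mentioned above (signal recovery, message reconstruction, parsing), a processing chain could be organised as follows; the stage functions are placeholders, not the authors' implementation.

```python
# Hypothetical skeleton of a three-stage capture-processing chain; the callbacks are
# placeholders standing in for the paper's signal-recovery, reconstruction and
# parsing modules.
def process_uplink_capture(iq_samples, recover_bursts, reassemble_messages, parse_message):
    bursts = recover_bursts(iq_samples)              # signal recovery: samples -> demodulated bursts
    messages = reassemble_messages(bursts)           # reconstruction: bursts -> signalling messages
    decoded = [parse_message(m) for m in messages]   # parsing: messages -> structured fields (None on failure)
    success_rate = sum(d is not None for d in decoded) / max(len(decoded), 1)
    return decoded, success_rate
```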

Relevance:

100.00%

Publisher:

Abstract:

Thermal screening masses related to the conserved vector current are determined for the case that the current carries a non-zero Matsubara frequency, both in a weak-coupling approach and through lattice QCD. We point out that such screening masses are sensitive to the same infrared physics as light-cone real-time rates. In particular, on the perturbative side, the inhomogeneous Schrödinger equation determining screening correlators is shown to have the same general form as the equation implementing LPM resummation for the soft-dilepton and photon production rates from a hot QCD plasma. The static potential appearing in the equation is identical to that whose soft part has been determined up to NLO and on the lattice in the context of jet quenching. Numerical results based on this potential suggest that screening masses overshoot the free results (multiples of 2πT) more strongly than at zero Matsubara frequency. Four-dimensional lattice simulations in two-flavour QCD at temperatures of 250 and 340 MeV confirm the non-static screening masses at the 10% level. Overall our results lend support to studies of jet quenching based on the same potential at T ≳ 250 MeV.

Relevance:

100.00%

Publisher:

Abstract:

Serial quantification of BCR-ABL1 mRNA is an important therapeutic indicator in chronic myeloid leukaemia, but there is substantial variation in the results reported by different laboratories. To improve comparability, an internationally accepted plasmid certified reference material (CRM) was developed according to ISO Guide 34:2009. Fragments of the BCR-ABL1 (e14a2 mRNA fusion), BCR and GUSB transcripts were amplified and cloned into pUC18 to yield plasmid pIRMM0099. Six different linearised plasmid solutions were produced with the following copy number concentrations, assigned by digital PCR, and expanded uncertainties: 1.08±0.13 × 10⁶, 1.08±0.11 × 10⁵, 1.03±0.10 × 10⁴, 1.02±0.09 × 10³, 1.04±0.10 × 10² and 10.0±1.5 copies/μl. The certification of the material for the number of specific DNA fragments per plasmid and the copy number concentration of the plasmid solutions, together with the assessment of inter-unit heterogeneity and stability, were performed according to ISO Guide 35:2006. Two suitability studies performed by 63 BCR-ABL1 testing laboratories demonstrated that this set of 6 plasmid CRMs can help to standardise the measured transcript numbers of e14a2 BCR-ABL1 and of three control genes (ABL1, BCR and GUSB). The set of six plasmid CRMs is distributed worldwide by the Institute for Reference Materials and Measurements (Belgium) and its authorised distributors (https://ec.europa.eu/jrc/en/reference-materials/catalogue/; CRM code ERM-AD623a-f).

Relevance:

100.00%

Publisher:

Abstract:

Linear regression is a technique widely used in digital signal processing. It consists of finding the linear function that best fits a given set of samples. This paper proposes different hardware architectures for implementing the linear regression method on FPGAs, especially targeting area-restricted systems. Area is saved at the cost of constraining the length of the input signal to a set of fixed values. We have implemented the proposed scheme in an Automatic Modulation Classifier, meeting the hard real-time constraints of this kind of system.
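The paper's FPGA architectures are not detailed in this abstract. The software sketch below only illustrates why fixing the input length saves resources: with the number of samples N fixed, every x-dependent term of the least-squares solution becomes a constant that can be precomputed (in hardware, hard-wired), so only the y-dependent sums are computed per input block. The names and the choice N = 64 are assumptions.

```python
# Software illustration of the fixed-length idea; not the paper's FPGA architecture.
N = 64                                   # one of the allowed, fixed input lengths (assumed)
SX = N * (N - 1) // 2                    # sum of x_i for x_i = 0..N-1, precomputed
SXX = (N - 1) * N * (2 * N - 1) // 6     # sum of x_i**2, precomputed
DENOM = N * SXX - SX * SX                # constant denominator of the normal equations

def linear_regression_fixed_n(y):
    """Least-squares fit y ~ a*x + b for exactly N samples with x = 0..N-1."""
    assert len(y) == N
    sy = sum(y)
    sxy = sum(i * yi for i, yi in enumerate(y))
    a = (N * sxy - SX * sy) / DENOM      # slope
    b = (sy - a * SX) / N                # intercept
    return a, b

print(linear_regression_fixed_n([2.0 + 0.5 * i for i in range(N)]))  # ~ (0.5, 2.0)
```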

Relevance:

100.00%

Publisher:

Abstract:

Real-time tritium concentrations in air originating from an ITER-like reactor as the source were obtained by coupling the European Centre for Medium-Range Weather Forecasts (ECMWF) numerical model with the Lagrangian atmospheric dispersion model FLEXPART. This ECMWF/FLEXPART tool was analysed under normal operating conditions in the Western Mediterranean Basin over 45 days in summer 2010. Comparison with NORMTRI plumes over the Western Mediterranean Basin showed that the real-time results overestimate the corresponding climatological sequence of tritium concentrations in air at several distances from the reactor. For this purpose, two cloud-development patterns were established: the first follows a cyclonic circulation over the Mediterranean Sea, while the second is based on the cloud carried over the interior of the Iberian Peninsula by another, stabilised circulation corresponding to a high-pressure system. One of the important remaining activities identified at that stage was the qualification of the tool. The aim of this paper is to present the ECMWF/FLEXPART products confronted with tritium-concentration-in-air data. For this purpose, a database for developing and validating ECMWF/FLEXPART tritium in both assessments has been selected from a NORMTRI run. Similarities and differences, underestimation and overestimation with respect to NORMTRI, will allow some features of ECMWF/FLEXPART to be refined.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we present a real-time tracking strategy based on direct methods for tracking tasks on board UAVs that is able to overcome the problems posed by the challenging conditions of the task: e.g. constant vibrations, fast 3D changes, and limited on-board capacity. The vast majority of approaches use feature-based methods to track objects. Nonetheless, in this paper we show that although some of these feature-based solutions are faster, direct methods can be more robust under fast 3D motions (fast changes in position), some changes in appearance, constant vibrations (without requiring any specific hardware or software for video stabilization), and situations where part of the object to track is outside the field of view of the camera. The performance of the proposed strategy is evaluated with images from real flight tests using different evaluation mechanisms (e.g. accurate position estimation using a Vicon system). Results show that our tracking strategy performs better than well-known feature-based algorithms and well-known configurations of direct methods, and that the recovered data are robust enough for vision-in-the-loop tasks.
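For readers unfamiliar with the term, a direct method aligns a template to the new frame by minimising a photometric error over the warp parameters, using pixel intensities directly rather than extracted features. The sketch below shows the idea for the simplest possible case (grayscale images, integer translation only, brute-force search); it is not the paper's tracker.

```python
# Minimal illustration of a direct (intensity-based) alignment step; not the
# paper's on-board tracking strategy.
import numpy as np

def track_translation(template, frame, prev_xy, search=8):
    """Return the (x, y) in `frame` minimising the SSD against `template` near prev_xy.
    template, frame: 2-D grayscale arrays; prev_xy: top-left corner from the last frame."""
    h, w = template.shape
    best, best_xy = np.inf, prev_xy
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = prev_xy[0] + dx, prev_xy[1] + dy
            if x < 0 or y < 0 or y + h > frame.shape[0] or x + w > frame.shape[1]:
                continue                                  # candidate window leaves the image
            patch = frame[y:y + h, x:x + w]
            err = np.sum((patch.astype(np.float32) - template.astype(np.float32)) ** 2)
            if err < best:                                # keep the photometrically best shift
                best, best_xy = err, (x, y)
    return best_xy
```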

Relevance:

100.00%

Publisher:

Abstract:

Virtualization techniques have received increased attention in the field of embedded real-time systems. Such techniques provide a set of virtual machines that run on a single hardware platform, thus allowing several application programs to be executed as though they were running on separate machines, with isolated memory spaces and a fraction of the real processor time available to each of them. This paper deals with some problems that arise when implementing real-time systems written in Ada on a virtual machine. The effects of virtualization on the performance of the Ada real-time services are analysed, and requirements for the virtualization layer are derived. Virtual-machine time services are also defined in order to properly support Ada real-time applications. The implementation of the ORK+ kernel on the XtratuM supervisor is used as an example.
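As a rough, language-agnostic illustration of the kind of timing effect analysed above, one can measure how late each activation of a periodic task fires relative to its ideal release time. The Python sketch below does this with an ordinary monotonic clock on a general-purpose OS; it is only a stand-in for the Ada-level measurements on ORK+/XtratuM, and all numbers are assumptions.

```python
# Illustrative only: release-lateness measurement for a periodic loop, a simple way
# to quantify timing degradation (e.g. when only a fraction of CPU time is available).
import time

def measure_release_jitter(period_s=0.01, iterations=200):
    base = time.monotonic()
    lateness = []
    for k in range(1, iterations + 1):
        next_release = base + k * period_s
        while time.monotonic() < next_release:     # simple sleep-and-poll wait for the release point
            time.sleep(0.0005)
        lateness.append(time.monotonic() - next_release)
    return max(lateness), sum(lateness) / len(lateness)

worst, mean = measure_release_jitter()
print(f"worst lateness {worst * 1e6:.0f} us, mean {mean * 1e6:.0f} us")
```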