841 results for implementation of organizational values


Relevance:

100.00%

Publisher:

Abstract:

Among the scientific objectives of the Radio Science Experiment hosted on board the ESA mission BepiColombo is the retrieval of the rotational state of planet Mercury. The estimation of the obliquity and of the libration amplitude has proven fundamental for constraining the interior composition of Mercury. This is accomplished by the Mercury Orbiter Radio science Experiment (MORE) through a tight interaction among different payloads, which makes the experiment particularly challenging. The underlying idea is to capture images of the same landmark on the surface of the planet at different epochs and to observe the displacement of the identified features with respect to a nominal rotation, from which the rotational parameters can be estimated. Observations must be planned accurately in order to obtain image pairs carrying the highest information content for the subsequent estimation process. This is not a trivial task, especially in light of the several dynamical constraints involved. Another delicate issue is the pattern matching between image pairs, for which the lowest correlation errors are desired. The research activity was conducted in the frame of the MORE rotation experiment and addressed the design and implementation of an end-to-end simulator of the experiment, with the final objective of establishing an optimal science planning of the observations. The thesis illustrates the implementation of the individual modules forming the simulator, along with the simulations performed. Finally, the results obtained with a preliminary release of the optimization algorithm are presented; the software will be improved and refined in the future, also taking into account the developments of the mission.
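The displacement measurement at the heart of this scheme can be illustrated with a minimal normalised cross-correlation search. The function below is a hypothetical sketch, not the MORE pipeline: it slides a template over a reference image and returns the integer-pixel offset with the highest correlation (a real pipeline would add sub-pixel refinement and illumination handling).

```python
import numpy as np

def ncc_displacement(ref, tpl):
    # Slide the template over the reference image and return the (row, col)
    # offset where the normalised cross-correlation is highest.
    th, tw = tpl.shape
    t = (tpl - tpl.mean()) / tpl.std()
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(ref.shape[0] - th + 1):
        for j in range(ref.shape[1] - tw + 1):
            w = ref[i:i + th, j:j + tw]
            ws = w.std()
            if ws == 0:          # flat window: correlation undefined, skip
                continue
            score = np.mean((w - w.mean()) / ws * t)
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos

# Usage: embed a synthetic landmark at a known offset and recover it.
rng = np.random.default_rng(0)
landmark = rng.random((6, 6))
scene = np.zeros((24, 24))
scene[5:11, 7:13] = landmark
shift = ncc_displacement(scene, landmark)
```

At the true offset the window equals the template, so the normalised correlation reaches its maximum of 1 there and the displacement is recovered exactly.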


Data Distribution Management (DDM) is a core part of the High Level Architecture (HLA) standard; its goal is to optimize the resources used by simulation environments to exchange data. It has to filter and match the information generated during a simulation so that each federate, that is, each simulation entity, only receives the information it needs. This matching must be done quickly and accurately in order to achieve good performance and to avoid transmitting irrelevant data, which would otherwise quickly saturate network resources. The main topic of this thesis is the implementation of a super partes DDM testbed that evaluates the quality of DDM approaches of all kinds: it supports both region-based and grid-based approaches, and it can accommodate other, as yet unknown, methods too. It ranks them by three factors: execution time, memory usage and distance from the optimal solution. A predefined set of instances is already available, but the creation of instances with user-provided parameters is also supported. The thesis is structured as follows. We start by introducing DDM and HLA and describing in detail what they do. The second chapter describes the state of the art, providing an overview of the best-known resolution approaches and the pseudocode of the most interesting ones. The third chapter describes the structure of the testbed we implemented. In the fourth chapter we present and compare the results obtained from the execution of the four approaches we implemented. The result of the work described in this thesis can be downloaded from SourceForge at the following link: https://sourceforge.net/projects/ddmtestbed/. It is licensed under the GNU General Public License version 3.0 (GPLv3).
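As an illustration of what such a testbed has to measure, here is a minimal region-based matching baseline (a hypothetical sketch, not code from the testbed): every update region is tested against every subscription region for overlap. This naive O(n·m) scheme is the natural reference against which faster grid- or sort-based approaches can be compared for both execution time and distance from the optimal solution.

```python
def overlaps(r1, r2):
    # Two d-dimensional regions overlap iff their extents overlap in
    # every dimension; each region is a list of (lower, upper) pairs.
    return all(lo1 < up2 and lo2 < up1
               for (lo1, up1), (lo2, up2) in zip(r1, r2))

def brute_force_matching(updates, subscriptions):
    # Naive matching: return the set of (update, subscription) index
    # pairs whose regions intersect.
    return {(i, j)
            for i, u in enumerate(updates)
            for j, s in enumerate(subscriptions)
            if overlaps(u, s)}

# Usage: two update regions and two subscription regions in 2-D space;
# only the first update region intersects the first subscription region.
updates = [[(0, 10), (0, 10)], [(20, 30), (20, 30)]]
subscriptions = [[(5, 15), (5, 15)], [(40, 50), (40, 50)]]
matches = brute_force_matching(updates, subscriptions)
```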


The European External Action Service (EEAS or Service) is one of the most significant and most debated innovations introduced by the Lisbon Treaty. This analysis intends to explain the anomalous design of the EEAS in light of its function, which consists in the promotion of external action coherence. Coherence is a principle of the EU legal system, which requires synergy in the external actions of the Union and its Members. It can be enforced only through the coordination of European policy-makers' initiatives, by bridging the gap between the 'Communitarian' and intergovernmental approaches. This is the 'Union method' envisaged by A. Merkel: "coordinated action in a spirit of solidarity - each of us in the area for which we are responsible but all working towards the same goal". The EEAS embodies the 'Union method', since it is institutionally linked to both Union organs and Member States. It is also capable of enhancing synergy in policy management and promoting unity in international representation, since its field of action is delimited not by an abstract concern for institutional balance but by a pragmatic assessment of the need for coordination in each sector. The challenge is now to make sure that this pragmatic approach is applied with respect to all the activities of the Service, in order to reinforce its effectiveness. The coordination brought by the EEAS is in fact the only means through which a European foreign policy can come into being: the choice is not between the Community method and the intergovernmental method, but between a coordinated position and nothing at all.


Network Theory is a prolific and lively research field, especially where it meets Biology. New concepts from this theory find application in areas where extensive datasets are already available for analysis, without the need to invest money to collect them. The only tools needed to carry out such an analysis are easily accessible: a computing machine and a good algorithm; and as both progress, thanks to technological advancement and human effort, ever wider datasets can be analysed. The aim of this work is twofold. The first is to provide an overview of one of these concepts, which originates at the meeting point between Network Theory and Statistical Mechanics: the entropy of a network ensemble. This quantity has been described from different angles in the literature, and our approach attempts a synthesis of the different points of view. The second part of the work is devoted to presenting a parallel algorithm that can evaluate this quantity over an extensive dataset. Finally, the algorithm is also used to analyse high-throughput biological data.
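For the common case of a maximum-entropy ensemble in which each link (i, j) is drawn independently with probability p_ij, the Shannon entropy factorises into a sum of binary entropies over the node pairs. The sketch below computes this quantity serially; it is an illustrative baseline under that standard assumption, not the parallel algorithm developed in the work.

```python
import math

def ensemble_entropy(p):
    # Shannon entropy (in nats) of a random-graph ensemble where link
    # (i, j) exists independently with probability p[i][j]:
    #   S = -sum_{i<j} [ p ln p + (1 - p) ln(1 - p) ]
    n = len(p)
    s = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            q = p[i][j]
            for x in (q, 1.0 - q):
                if x > 0.0:      # 0 * ln 0 contributes nothing
                    s -= x * math.log(x)
    return s

# Usage: with p = 1/2 on every pair, each of the n(n-1)/2 possible links
# is maximally uncertain and contributes exactly ln 2.
n = 10
p_half = [[0.5] * n for _ in range(n)]
entropy = ensemble_entropy(p_half)  # = 45 * ln 2
```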


The conventional way to calculate hard scattering processes in perturbation theory using Feynman diagrams is not efficient enough to calculate all necessary processes, for example for the Large Hadron Collider, to a sufficient precision. Two alternatives to order-by-order calculations are studied in this thesis.

In the first part we compare the numerical implementations of four different recursive methods for the efficient computation of Born gluon amplitudes: Berends-Giele recurrence relations and recursive calculations with scalar diagrams, with maximal helicity violating vertices and with shifted momenta. Of the four methods considered, the Berends-Giele method performs best if the number of external partons is eight or larger; for fewer than eight external partons, the recursion relation with shifted momenta offers the best performance. When investigating numerical stability and accuracy, we found that all methods give satisfactory results.

In the second part of this thesis we present an implementation of a parton shower algorithm based on the dipole formalism. The formalism treats initial- and final-state partons on the same footing. The shower algorithm can be used for hadron colliders and electron-positron colliders, and massive partons in the final state are also included. Finally, we studied numerical results for an electron-positron collider, the Tevatron and the Large Hadron Collider.


This thesis presents a CMOS amplifier with high common-mode rejection, designed in UMC 130 nm technology. The goal is to achieve a high amplification factor for a wide range of biological signals (with frequencies in the range of 10 Hz-1 kHz) while rejecting the common-mode noise signal. A Data Acquisition System is presented, composed of a Delta-Sigma-like modulator and an antenna, which is the core of a portable low-complexity radio system; the amplifier is designed to interface the data acquisition system with a sensor that acquires the electrical signal. The modulator asynchronously acquires and samples human muscle activity, sending a quasi-digital pattern that encodes the acquired signal. Translating the muscle activity with this pattern entails only a minor loss of information compared to an encoding technique that uses a standard digital signal via Impulse-Radio Ultra-Wide Band (IR-UWB). The biological signals needed for electromyographic analysis have an amplitude of 10-100 μV and must be highly amplified and separated from the overwhelming 50 mV common-mode noise signal. Various proof-of-concept tests are presented, as well as evidence that the design also works with different sensors, such as radiation measurement for dosimetry studies.
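The rejection requirement can be made concrete with an idealised amplifier model (an illustrative sketch with hypothetical gain values, not the designed circuit): the output mixes the differential gain A_d acting on the wanted signal with the residual common-mode gain A_cm, and the CMRR in decibels quantifies their ratio.

```python
import math

def diff_amp_output(v_plus, v_minus, a_d, a_cm):
    # Idealised amplifier: output = A_d * differential + A_cm * common mode.
    v_diff = v_plus - v_minus
    v_cm = 0.5 * (v_plus + v_minus)
    return a_d * v_diff + a_cm * v_cm

def cmrr_db(a_d, a_cm):
    # Common-mode rejection ratio in decibels.
    return 20.0 * math.log10(abs(a_d / a_cm))

# Usage: a 100 uV electromyographic signal riding on 50 mV of common-mode
# noise, with hypothetical gains A_d = 1000 and A_cm = 0.01 (CMRR = 100 dB):
# the wanted term (0.1 V) dominates the common-mode leakage (0.5 mV).
v_cm, v_sig = 50e-3, 100e-6
out = diff_amp_output(v_cm + v_sig / 2, v_cm - v_sig / 2, 1000.0, 0.01)
rejection = cmrr_db(1000.0, 0.01)
```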


Report on the design and implementation of the software platform that implements the archive of the SATNET project. University satellites are in view of their own Ground Station for only a few minutes per day: SATNET addresses the need to communicate with a university satellite in low orbit for more than the few minutes per day that a single Ground Station allows. This is achieved through a network of satellite Ground Stations linked by specific shared missions, which pool the data received from one or more satellites, increasing their data-per-day yield and allowing better use of the Ground Stations themselves. The network uses the Internet as its connection channel and includes an archive in which to store the received data, so that it can later be consulted and retrieved. The subject of this thesis work was the development and implementation of this archive: using a dynamic website, the software answers all the requirements outlined above, allowing authenticated users to insert data and others to access it. The software is complete and working but not finished, since some requirements have not yet been formulated; for example, neither the type of information that can be uploaded nor the fields required in the registration form for the various users have been specified. In these cases generic fields were inserted, leaving the user the possibility of modifying them later. The software was therefore conceived to be easily customizable and modifiable even by inexperienced users simply by reading the thesis, which thus serves as a genuine guide to the use, installation, customization and maintenance of the software platform.
The thesis sets out the objectives and requirements, shows the appearance of the website and its functionality, and explains step by step the procedure for modifying the appearance of the pages and some configuration parameters. Moreover, should substantial modifications to the project be necessary, it introduces the programming languages needed for web development and helps the user understand the structure of the software. It concludes with some suggestions for possible modifications, feasible only after a phase of definition of the objectives and specific requirements. In the future, the implementation and customization of the software is expected, as well as the integration of the archive into the SATNET project, with the aim of improving and fostering the diffusion and sharing of joint projects among European and non-European universities.


EBPR (Enhanced Biological Phosphorus Removal) is a type of secondary treatment in WWTPs (WasteWater Treatment Plants) that is widely used in full-scale plants worldwide. Phosphorus occurring in aquatic systems in high amounts can cause eutrophication and, consequently, the death of fauna and flora. A specialized biomass is used to remove the phosphorus: the so-called PAOs (Polyphosphate Accumulating Organisms), which accumulate phosphorus in the form of polyphosphate in their cells. Some of these organisms, the DPAOs (Denitrifying Polyphosphate Accumulating Organisms), use nitrate or nitrite as electron acceptor, thus also contributing to the removal of these compounds from the wastewater, although side reactions may lead to the formation of nitrous oxide. The aim of this project was to simulate an EBPR at laboratory scale, acclimatizing and enriching the specialized biomass. Two bioreactors were operated as Sequencing Batch Reactors (SBRs), one enriched in Accumulibacter, the other in Tetrasphaera (both PAOs): Tetrasphaera microorganisms are able to take up amino acids as a carbon source, while Accumulibacter take up organic carbon (volatile fatty acids, VFA). In order to measure the removal of COD, phosphorus and nitrogen-derived compounds, different analyses were performed: spectrophotometric measurement of phosphorus, nitrate, nitrite and ammonia concentrations; TOC (Total Organic Carbon, measuring carbon consumption); VFA via HPLC (High Performance Liquid Chromatography); total and volatile suspended solids following APHA standard methods; and qualitative characterization of the microbial population via FISH (Fluorescence In Situ Hybridization). Batch tests were also performed to monitor NOx production. Both specialized populations accumulated as a result of the SBR operation; however, Accumulibacter were found to take up phosphates to a greater extent. Both populations were able to efficiently remove the nitrates and organic compounds occurring in the feed.
The experimental work was carried out at FCT of Universidade Nova de Lisboa (FCT-UNL) from February to July 2014.


Cloud services are becoming ever more important in everyone's life. Cloud storage? Web mail? We don't need to work in big IT companies to be surrounded by cloud services. Another thing that is growing in importance, or at least that should be considered ever more important, is the concept of privacy: the more we rely on services we know close to nothing about, the more we should be worried about our privacy. In this work, I analyze a prototype software based on a peer-to-peer architecture for offering cloud services, to see whether it is possible to make it completely anonymous, meaning that not only will the users be anonymous, but the peers composing the network will not know each other's real identity either. To make this possible, I make use of anonymizing networks such as Tor. I start by studying the state of the art of Cloud Computing, looking at some real examples, and then analyze the architecture of the prototype, highlighting the differences between its distributed nature and the somewhat centralized solutions offered by the famous vendors. After that, I go as deep as possible into the working principles of anonymizing networks, because they are not something that can just be 'applied' mindlessly: some de-anonymization techniques are very subtle, so things must be studied carefully. I then implement the required changes and test the new anonymized prototype to see how its performance differs from that of the standard one. The prototype is run on many machines, orchestrated by a tester script that automatically starts and stops them and makes all the required API calls. As for where to find all these machines, I make use of Amazon EC2 cloud services and their on-demand instances.
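The core idea behind networks like Tor, layered ("onion") encryption in which each relay can peel exactly one layer and learns nothing beyond its neighbours, can be shown with a deliberately toy sketch. XOR stands in for real public-key cryptography purely for readability; Tor's actual circuits use authenticated key exchanges over TLS, and nothing this simple is safe in practice.

```python
def _xor(data: bytes, key: bytes) -> bytes:
    # Toy "cipher": XOR with a repeating key (illustration only, insecure).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def onion_wrap(message: bytes, relay_keys) -> bytes:
    # Add one layer per relay, innermost last, so the first relay in the
    # path peels the outermost layer.
    data = message
    for key in reversed(relay_keys):
        data = _xor(data, key)
    return data

def relay_peel(data: bytes, key: bytes) -> bytes:
    # Each relay removes exactly one layer and forwards the rest.
    return _xor(data, key)

# Usage: a three-hop path; each relay peels its own layer in order,
# and only after the last hop does the plaintext reappear.
keys = [b"entry-key", b"middle-key", b"exit-key"]
cell = onion_wrap(b"hello cloud", keys)
for k in keys:
    cell = relay_peel(cell, k)
recovered = cell
```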


The thesis work was developed within the European Student Earth Orbiter (ESEO) project, supported by the European Space Agency (ESA) in order to help prepare a well-qualified space-engineering workforce for Europe's future. In the following chapters we analyse how to simulate some of the ESEO subsystems. The first is the Thermal Subsystem, which evaluates the temperature evolution of the on-board instruments; for this purpose it is also necessary to simulate the orbital and attitude dynamics of the spacecraft, in order to evaluate the external environmental fluxes. The Power Subsystem is the following step: it models the ability of the spacecraft to produce and store electrical energy. Finally, we integrate into our software a block capable of simulating the communication link between the satellite and the Ground Station (GS). This last step was designed and validated during the preparation of the thesis.
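At its very simplest, the thermal part of such a simulator reduces to a single-node energy balance, m·c_p·dT/dt = Q_in − ε·σ·A·T⁴, integrated in time. The code below is a deliberately simplified stand-in under that assumption, with hypothetical parameter values; a real spacecraft thermal model tracks many nodes and time-varying orbital fluxes.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def simulate_temperature(t_end, dt, t0, q_in, area, emissivity, mass, c_p):
    # Forward-Euler integration of a single-node thermal balance:
    #   m * c_p * dT/dt = Q_in - eps * sigma * A * T^4
    temp = t0
    for _ in range(int(t_end / dt)):
        d_temp = (q_in - emissivity * SIGMA * area * temp**4) / (mass * c_p)
        temp += d_temp * dt
    return temp

# Usage (hypothetical numbers): 100 W absorbed, 0.1 m^2 radiating area,
# emissivity 0.8, a 10 kg node with c_p = 900 J/(kg K); the temperature
# relaxes towards radiative equilibrium T_eq = (Q_in/(eps*sigma*A))**0.25.
t_final = simulate_temperature(t_end=100_000.0, dt=1.0, t0=300.0,
                               q_in=100.0, area=0.1, emissivity=0.8,
                               mass=10.0, c_p=900.0)
t_eq = (100.0 / (0.8 * SIGMA * 0.1)) ** 0.25
```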


Background: Switzerland is introducing a DRG-based (Diagnosis Related Groups) system for hospital financing in 2012 in order to increase the efficiency and transparency of Swiss health care. DRG-based hospital reimbursement is not being realized simultaneously in all Swiss cantons, and several cantons have already implemented DRG-based financing irrespective of the national agenda, a setting that provides an opportunity to compare the situation across cantons. The effects of introducing DRGs anticipated for providers and insurers are relatively well known, but it remains less clear what effects DRGs will have on the populations served. The objective of the study is therefore to analyze differences in the volume and major quality indicators of care between areas with and without DRG-based hospital reimbursement from a population-based perspective.
Methods: Small area analysis of all hospitalizations in acute care hospitals and of all consultations with physicians in their own practice reimbursed by mandatory basic health insurance during 2003-2007.
Results: The results show fewer hospitalizations and a relocation of resources to outpatient care in areas with DRG reimbursement. The overall burden of disease, expressed as per capita DRG cost weights, was almost identical between the two types of hospital reimbursement, and no distinct temporal differences were detected in this respect. However, the results show considerably higher 90-day rehospitalization rates in DRG areas.
Conclusion: The study provides evidence of both desired and harmful effects related to the implementation of DRGs. Systematic monitoring of outcomes and quality of care are therefore essential elements to maintain in the Swiss health system after DRGs are implemented on a nationwide basis in 2012.
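One of the quality indicators compared here, the 90-day rehospitalization rate, can be sketched as follows. This is an illustrative definition on hypothetical records of (patient id, admission date, discharge date), not the exact operationalization used in the study.

```python
from datetime import date

def rehosp_rate_90d(stays):
    # Fraction of discharges that are followed by a readmission of the
    # same patient within 90 days; every discharge counts as an index stay.
    by_patient = {}
    for pid, adm, dis in stays:
        by_patient.setdefault(pid, []).append((adm, dis))
    index_stays = rehospitalized = 0
    for episodes in by_patient.values():
        episodes.sort()
        index_stays += len(episodes)
        for (_, dis), (next_adm, _) in zip(episodes, episodes[1:]):
            if (next_adm - dis).days <= 90:
                rehospitalized += 1
    return rehospitalized / index_stays

# Usage (hypothetical records): patient "A" is readmitted 30 days after
# discharge, patient "B" is never readmitted, so 1 of 3 discharges counts.
records = [
    ("A", date(2006, 1, 1), date(2006, 1, 10)),
    ("A", date(2006, 2, 9), date(2006, 2, 15)),
    ("B", date(2006, 3, 1), date(2006, 3, 5)),
]
rate = rehosp_rate_90d(records)  # 1/3
```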


Raltegravir (RAL) achieved remarkable rates of virologic suppression in randomized clinical trials, but to date, data on its efficacy and on the factors associated with treatment failure in a routine clinical care setting are limited.