886 results for Cloud Fraction
Abstract:
The aim of this thesis is to explore the world of Cloud Computing, seeking to understand its main architectural characteristics and then to examine the fundamental components that turn an IT infrastructure into a cloud infrastructure, namely the Cloud Operating Systems.
Abstract:
Development of an application to share contacts from SugarCRM with Google Contacts
Abstract:
In this study the Aerodyne Aerosol Mass Spectrometer (AMS) was used during three laboratory measurement campaigns: FROST1, FROST2 and ACI-03. The FROST campaigns took place at the Leipzig Aerosol Cloud Interaction Simulator (LACIS) at the IfT in Leipzig, and the ACI-03 campaign was conducted at the AIDA facility at the Karlsruhe Institute of Technology (KIT). In all three campaigns, the effect of coatings on mineral dust ice nuclei (IN) was investigated. During the FROST campaigns, Arizona Test Dust (ATD) particles of 200, 300 and 400 nm diameter were covered with thin coatings (< 7 nm) of sulphuric acid. With such thin coatings, the AMS was operated close to its detection limits. Until now it has not been possible to accurately determine AMS detection limits during regular measurements. Therefore, the mathematical tools for analysing the detection limits of the AMS have been improved in this work. It is now possible to calculate detection limits of the AMS under operating conditions, without losing precious measurement time by sampling through a particle filter. The instrument was characterised in more detail to enable a correct quantification of the sulphate loadings on the ATD particle surfaces. Correction factors for the instrument inlet transmission, the collection efficiency, and the relative ionisation efficiency have been determined. With these corrections it was possible to quantify the sulphate mass per particle on the ATD after the condensation of sulphuric acid on its surface. The AMS results have been combined with the ice nucleus counter results. This revealed that the IN efficiency of ATD is reduced when it is coated with sulphuric acid. The reason for this reduction is a chemical reaction of sulphuric acid with the particle surface. These reactions take place to a greater extent when the aerosol is humidified or heated after the coating with sulphuric acid. A detailed analysis of the solubility and the evaporation temperature of the surface reaction products revealed that most likely aluminium sulphate is produced in these reactions.
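As a point of reference for the detection-limit discussion above, a widely used convention (a general definition, not necessarily the specific formulation developed in this thesis) takes the detection limit of a species as three times the standard deviation of its apparent signal in particle-free air:

$$\mathrm{DL}_i = 3\,\sigma_{\mathrm{blank},\,i}$$

The improvement described here lies in estimating that noise level from the regular measurement data themselves, rather than from dedicated particle-filter periods.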
Abstract:
The cloud makes it possible to share not only information but also data and programs quickly and intuitively, greatly increasing everyone's capacity for collaboration, from personal users up to professional ones. Companies can build their own business applications and have them communicate remotely with the cloud solutions available on the market, through the tools provided by the vendors. The applications themselves can be deployed on cloud servers, managed entirely by external companies, which allow a pay-per-use pricing model free from the burden of managing a server. In this scenario, the characteristics of some existing cloud services are surveyed, with particular attention to document management software, and an application that communicates with them and exploits their capabilities is developed.
Abstract:
This PhD thesis discusses the impact of Cloud Computing infrastructures on Digital Forensics in its twofold role as a target of investigations and as a helping hand to investigators. The Cloud offers cheap and almost limitless computing power and storage space, which can be leveraged to commit either new or old crimes and to host the related traces. Conversely, the Cloud can help forensic examiners find clues better and earlier than traditional analysis applications, thanks to its dramatically improved evidence-processing capabilities. In both cases, a new arsenal of software tools needs to be made available. The development of this novel weaponry and its technical and legal implications, from the point of view of the repeatability of technical assessments, is discussed throughout the following pages and constitutes the unprecedented contribution of this work.
Abstract:
Redshift Space Distortions (RSD) are an apparent anisotropy in the distribution of galaxies due to their peculiar motions. These features are imprinted in the galaxy correlation function, which describes how these structures are distributed around each other. RSD can be represented by a distortion parameter $\beta$, which is strictly related to the growth of cosmic structures. For this reason, measurements of RSD can be exploited to constrain cosmological parameters, such as, for example, the neutrino mass. Neutrinos are neutral subatomic particles that come in three flavours: the electron, the muon and the tau neutrino. Their mass differences can be measured in oscillation experiments. Information on the absolute scale of the neutrino mass can come from cosmology, since neutrinos leave a characteristic imprint on the large-scale structure of the Universe. The aim of this thesis is to provide constraints on the accuracy with which the neutrino mass can be estimated when exploiting measurements of RSD. In particular, we want to describe how the error on the neutrino mass estimate depends on three fundamental parameters of a galaxy redshift survey: the density of the catalogue, the bias of the sample considered, and the volume observed. To do this we make use of the BASICC simulation, from which we extract a series of dark matter halo catalogues characterized by different values of bias, density and volume. These mock data are analysed via a Markov Chain Monte Carlo procedure, in order to estimate the neutrino mass fraction, using the software package CosmoMC, which has been suitably modified. In this way we are able to extract a fitting formula describing our measurements, which can be used to forecast the precision reachable with this kind of observation in future surveys such as Euclid.
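For orientation, the link between the distortion parameter and the growth of structure exploited in this kind of analysis is the standard linear-theory relation, together with the usual definition of the neutrino mass fraction (textbook expressions, not results of the thesis):

$$\beta = \frac{f(z)}{b}, \qquad f(z) \simeq \Omega_{\mathrm{m}}(z)^{0.55}, \qquad f_\nu \equiv \frac{\Omega_\nu}{\Omega_{\mathrm{m}}}, \qquad \Omega_\nu h^2 \simeq \frac{\sum m_\nu}{93\ \mathrm{eV}}$$

Massive neutrinos suppress the growth of small-scale structure, so measuring $\beta$ for tracers of known bias $b$ constrains $f(z)$ and hence $f_\nu$ and $\sum m_\nu$.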
Abstract:
The Large Magellanic Cloud (LMC) is widely considered the first step of the cosmological distance ladder, since it contains many different distance indicators. An accurate determination of the distance to the LMC allows one to calibrate these distance indicators, which are then used to measure the distance to far objects. The main goal of this thesis is to study the distance and structure of the LMC, as traced by different distance indicators. For this purpose three types of distance indicators were chosen: Classical Cepheids, "hot" eclipsing binaries and RR Lyrae stars. These objects belong to different stellar populations tracing, in turn, different sub-structures of the LMC. The RR Lyrae stars (age > 10 Gyr) are distributed smoothly and likely trace the halo of the LMC. Classical Cepheids are young objects (age 50-200 Myr), mainly located in the bar and spiral arm of the galaxy, while "hot" eclipsing binaries mainly trace its star-forming regions. Furthermore, we have chosen these distance indicators for our study because the calibration of their zero-points is based on fundamental geometric methods. The ESA cornerstone mission Gaia, launched on 19 December 2013, will measure trigonometric parallaxes for one billion stars with an accuracy of 20 micro-arcsec at V=15 mag and 200 micro-arcsec at V=20 mag, and will thus allow us to calibrate the zero-points of Classical Cepheids, eclipsing binaries and RR Lyrae stars with unprecedented precision.
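As a reminder of why the quoted parallax accuracies translate into zero-point calibrations (standard relations, not specific to this thesis), the distance follows directly from the trigonometric parallax, and its relative error equals, to first order, the relative parallax error:

$$d\,[\mathrm{pc}] = \frac{1}{\pi\,[\mathrm{arcsec}]}, \qquad \frac{\sigma_d}{d} \simeq \frac{\sigma_\pi}{\pi}$$

For instance, a 20 micro-arcsec parallax uncertainty for a Galactic calibrator at 2 kpc ($\pi = 500$ micro-arcsec) corresponds to a distance precision of about 4%; the numbers in this example are purely illustrative.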
Abstract:
Sulfate aerosol plays an important but uncertain role in cloud formation and radiative forcing of the climate, and is also important for acid deposition and human health. The oxidation of SO2 to sulfate is a key reaction in determining the impact of sulfate in the environment through its effect on aerosol size distribution and composition. This thesis presents a laboratory investigation of sulfur isotope fractionation during SO2 oxidation by the most important gas-phase and heterogeneous pathways occurring in the atmosphere. The fractionation factors are then used to examine the role of sulfate formation in cloud processing of aerosol particles during the HCCT campaign in Thuringia, central Germany. The fractionation factor for the oxidation of SO2 by ·OH radicals was measured by reacting SO2 gas, with a known initial isotopic composition, with ·OH radicals generated from the photolysis of water at -25, 0, 19 and 40°C (Chapter 2). The product sulfate and the residual SO2 were collected as BaSO4, and their sulfur isotopic compositions were measured with the Cameca NanoSIMS 50. The measured fractionation factor for 34S/32S during gas-phase oxidation is $\alpha_{\mathrm{OH}} = (1.0089 \pm 0.0007) - \left((4 \pm 5) \times 10^{-5}\right)\,T$ (°C). Fractionation during oxidation by the major aqueous pathways was measured by bubbling the SO2 gas through a solution of H2O2.
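For reference, the fractionation factor reported above is conventionally defined as the ratio of the isotope ratios in the product sulfate and the reactant SO2, and the isotopic evolution of the residual SO2 in a closed system is commonly described by a Rayleigh law (standard isotope-geochemistry relations, not a derivation taken from this thesis):

$$\alpha_{\mathrm{OH}} = \frac{\left(^{34}\mathrm{S}/^{32}\mathrm{S}\right)_{\mathrm{sulfate}}}{\left(^{34}\mathrm{S}/^{32}\mathrm{S}\right)_{\mathrm{SO_2}}}, \qquad \frac{R_{\mathrm{SO_2}}}{R_{\mathrm{SO_2},0}} = f_{\mathrm{SO_2}}^{\ \alpha - 1}$$

where $R$ denotes the $^{34}\mathrm{S}/^{32}\mathrm{S}$ ratio and $f_{\mathrm{SO_2}}$ is the fraction of the initial SO2 remaining.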
Abstract:
One of the most discussed and interesting topics in computing today is undoubtedly Cloud Computing. New organizations offering services of this kind are appearing everywhere, and many companies now want to learn how to use them, migrating their data centres and applications to the Cloud. This is also driven by the ever stronger push the large companies are giving the computing community: Google, Amazon, Microsoft, Apple and many others speak ever more frequently about Cloud Computing and are in turn restructuring themselves deeply in order to offer Cloud services, adapting to this great change taking place in the IT sector. However, the great movement of energy, capital, investment and interest caused by the advent of Cloud Computing does not really help in understanding what it actually is, to the point that no unique, shared definition of it exists today. Moreover, the strong pressure it receives from the market means that many of its most distinctive characteristics, from a software engineering point of view, are hidden and overshadowed by other properties that are architecturally less important but have a greater impact on the audience of potential customers. The aim of this thesis is therefore to bring clarity to the world of Cloud Computing, focusing in particular on the design patterns most widely used in the development of cloud applications and presenting the main technologies used today, both in industry and in research, to build cloud applications, with particular attention to Microsoft Orleans.
Abstract:
Spectroscopy of the 1S-2S transition of antihydrogen confined in a neutral-atom trap, and comparison with the equivalent spectral line in hydrogen, will provide an accurate test of CPT symmetry and the first one in a mixed baryon-lepton system. Also, with neutral antihydrogen atoms, the gravitational interaction between matter and antimatter can be tested unperturbed by the much stronger Coulomb forces.

Antihydrogen is regularly produced at CERN's Antiproton Decelerator by three-body recombination (TBR) of one antiproton and two positrons. The method requires injecting antiprotons into a cloud of positrons, which raises the average temperature of the antihydrogen atoms produced well above the typical 0.5 K trap depths of neutral-atom traps. Therefore only very few antihydrogen atoms can be confined at a time. Precision measurements, like laser spectroscopy, will greatly benefit from larger numbers of simultaneously trapped antihydrogen atoms.

Therefore, the ATRAP collaboration developed a different production method that has the potential to create much larger numbers of cold, trappable antihydrogen atoms. Positrons and antiprotons are stored and cooled in a Penning trap in close proximity. Laser-excited cesium atoms collide with the positrons, forming Rydberg positronium, a bound state of an electron and a positron. The positronium atoms are no longer confined by the electric potentials of the Penning trap, and some drift into the neighboring cloud of antiprotons where, in a second charge-exchange collision, they form antihydrogen. The antiprotons remain at rest during the entire process, so much larger numbers of trappable antihydrogen atoms can be produced. Laser excitation is necessary to increase the efficiency of the process, since the cross sections for charge-exchange collisions scale with the fourth power of the principal quantum number n.

This method, named double charge-exchange, was demonstrated by ATRAP in 2004. Since then, ATRAP has constructed a new combined Penning-Ioffe trap and a new laser system. The goal of this thesis was to implement the double charge-exchange method in this new apparatus and increase the number of antihydrogen atoms produced.

Compared to our previous experiment, we could raise the numbers of positronium and antihydrogen atoms produced by two orders of magnitude. Most of this gain is due to the larger positron and antiproton plasmas available by now, but we could also achieve significant improvements in the efficiencies of the individual steps. We therefore showed that double charge-exchange can produce numbers of antihydrogen comparable to the TBR method, while the fraction of cold, trappable atoms is expected to be much higher. This work is therefore an important step towards precision measurements with trapped antihydrogen atoms.
Abstract:
Aerosol particles are important actors in the Earth's atmosphere and climate system. They scatter and absorb sunlight, serve as nuclei for water droplets and ice crystals in clouds and precipitation, and are a subject of concern for public health. Atmospheric aerosols originate from both natural and anthropogenic sources, and emissions resulting from human activities have the potential to influence the hydrological cycle and climate. An assessment of the extent and impacts of this human influence requires a sound understanding of the natural aerosol background. This dissertation addresses the composition, properties, and atmospheric cycling of biogenic aerosol particles, which represent a major fraction of the natural aerosol burden. The main focal points are: (i) studies of the autofluorescence of primary biological aerosol particles (PBAP) and its application in ambient measurements, and (ii) X-ray microscopic and spectroscopic investigations of biogenic secondary organic aerosols (SOA) from the Amazonian rainforest.

Autofluorescence of biological material has received increasing attention in atmospheric science because it allows real-time monitoring of PBAP in ambient air; however, it is associated with high uncertainty. This work aims at reducing that uncertainty through a comprehensive characterization of the autofluorescence properties of relevant biological materials. Fluorescence spectroscopy and microscopy were applied to analyze the fluorescence signatures of pure biological fluorophores, potential non-biological interferences, and various types of reference PBAP. Characteristic features and fingerprint patterns were found and provide support for the operation, interpretation, and further development of PBAP autofluorescence measurements. Online fluorescence detection and offline fluorescence microscopy were jointly applied in a comprehensive bioaerosol field measurement campaign that provided unprecedented insights into PBAP-linked biosphere-atmosphere interactions in a North American semi-arid forest environment. Rain showers were found to trigger massive bursts of PBAP, including high concentrations of biological ice nucleators that may promote further precipitation and can be regarded as part of a bioprecipitation feedback cycle in the climate system.

In the pristine tropical rainforest air of the Amazon, most cloud and fog droplets form on biogenic SOA particles, but the composition, morphology, mixing state and origin of these particles are hardly known. X-ray microscopy and spectroscopy (STXM-NEXAFS) revealed distinctly different types of secondary organic matter (carboxyl- vs. hydroxy-rich) with internal structures that indicate a strong influence of phase segregation, cloud and fog processing on SOA formation and aging. In addition, nanometer-sized potassium-rich particles emitted by microorganisms and vegetation were found to act as seeds for the condensation of SOA. Thus, the influence of forest biota on the atmospheric abundance of cloud condensation nuclei appears to be more direct than previously assumed. Overall, the results of this dissertation suggest that biogenic aerosols, clouds and precipitation are indeed tightly coupled through a bioprecipitation cycle, and that advanced microscopic and spectroscopic techniques can provide detailed insights into these mechanisms.
Abstract:
The distributed nature of Cloud Computing, which entails a high degree of resource sharing and a multitude of accesses to computer systems, allows intruders to exploit this technology for malicious purposes. To counter intrusions and attacks on users' sensitive data, intrusion detection systems and defence methods are implemented in virtualized environments, with the aim of guaranteeing overall security based on both prevention and remediation: an effective security system must detect intrusions and imminent threats, providing a first, a priori defensive phase, and at the same time avoid total failure even after sustaining damage and keep the quality of service high, guaranteeing a second, a posteriori defensive phase. This thesis illustrates the many ways in which distributed attacks and malicious hacking operate, with particular reference to the latest generation of threats, and defines the main strategies and techniques for guaranteeing the security, protection and integrity of data within a Cloud system.
Abstract:
In particle physics, large computing and storage capacities are required in order to perform data analysis. The LHC Computing Grid is a worldwide computing infrastructure and, at the same time, a set of services developed by a large community of physicists and computer scientists and distributed across computing centres all over the world. This infrastructure has proven its value for the analysis of the data collected during Run 1 of the LHC, playing a fundamental role in the discovery of the Higgs boson. Today, Cloud computing is emerging as a new computing paradigm for accessing large amounts of resources shared by many scientific communities. Given the technical requirements of Run 2 (and beyond) of the LHC, the scientific community is interested in contributing to the development of Cloud technologies and in verifying whether they can provide a complementary approach, or even a valid alternative, to the existing technological solutions. The purpose of this thesis is to test a Cloud infrastructure and compare its performance with the LHC Computing Grid. Chapter 1 contains a general overview of the Standard Model. Chapter 2 describes the LHC accelerator and the experiments operating at it, with particular attention to the CMS experiment. Chapter 3 deals with computing in high-energy physics and examines the Grid and Cloud paradigms. Chapter 4, the last of this work, reports the results of my comparative analysis of Grid and Cloud performance.
Abstract:
Data Distribution Management (DDM) is a component of the High Level Architecture standard. Its task is to detect overlaps between update and subscription extents efficiently. This thesis discusses why a framework was needed and how it was implemented. Testing algorithms under fair conditions, libraries that make it easier to implement algorithms, and automation of the build phase were the main motivations for starting the development of the framework. The driving reason was that, while surveying the scientific literature on DDM and the various algorithms, it became apparent that every paper generated its own ad hoc data for testing. A goal of this framework is therefore to make it possible to compare the algorithms on a consistent data set. It was decided to test the framework on the Cloud in order to obtain a more reliable comparison between runs by different users. Two of the most widely used services were considered: Amazon AWS EC2 and Google App Engine. Their advantages and disadvantages are presented, together with the reasons for choosing Google App Engine. Four algorithms were developed: Brute Force, Binary Partition, Improved Sort and Interval Tree Matching, and tests were carried out on execution time and peak memory usage; see the sketch below for the matching problem they solve. The results show that Interval Tree Matching and Improved Sort are the most efficient. All tests were carried out on the sequential versions of the algorithms, so a further reduction in execution time is still possible for the Interval Tree Matching algorithm.
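As a minimal illustration of the matching problem these algorithms address (a sketch for one-dimensional extents with hypothetical names, not the framework's code), the following Python snippet contrasts a brute-force check with a simple sort-based sweep:

```python
from typing import List, Tuple

Extent = Tuple[float, float]  # (lower, upper) bound of a 1-D region

def brute_force_matching(updates: List[Extent],
                         subscriptions: List[Extent]) -> List[Tuple[int, int]]:
    """Check every update extent against every subscription extent: O(n*m)."""
    matches = []
    for i, (u_lo, u_hi) in enumerate(updates):
        for j, (s_lo, s_hi) in enumerate(subscriptions):
            if u_lo <= s_hi and s_lo <= u_hi:  # closed intervals overlap
                matches.append((i, j))
    return matches

def sort_based_matching(updates: List[Extent],
                        subscriptions: List[Extent]) -> List[Tuple[int, int]]:
    """Sweep over endpoints sorted along the dimension, keeping the sets of
    currently open extents; a match is recorded when an extent of one kind
    opens while extents of the other kind are open (spirit of sort-based DDM)."""
    events = []  # (coordinate, is_end, kind, index); starts sort before ends
    for i, (lo, hi) in enumerate(updates):
        events += [(lo, 0, 'u', i), (hi, 1, 'u', i)]
    for j, (lo, hi) in enumerate(subscriptions):
        events += [(lo, 0, 's', j), (hi, 1, 's', j)]
    events.sort()
    open_u, open_s, matches = set(), set(), set()
    for _, is_end, kind, idx in events:
        if is_end:
            (open_u if kind == 'u' else open_s).discard(idx)
        elif kind == 'u':
            open_u.add(idx)
            matches.update((idx, j) for j in open_s)
        else:
            open_s.add(idx)
            matches.update((i, idx) for i in open_u)
    return sorted(matches)
```

Both functions return the same set of (update, subscription) index pairs, which is exactly the kind of cross-check a shared, consistent test data set makes possible.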
Abstract:
Atmospheric particles influence the climate through processes such as scattering, reflection and absorption. In addition, a fraction of the aerosol particles act as cloud condensation nuclei (CCN), which affect the optical properties and backscattering of clouds and consequently the radiation budget. Whether an aerosol particle exhibits the properties of a cloud condensation nucleus depends primarily on its size and chemical composition. Therefore, single-particle laser ablation mass spectrometry was applied, which allows a size-resolved chemical analysis of individual particles and is intended to contribute to the understanding of the multiphase chemical processes taking place within clouds.

In this work, the single-particle mass spectrometer ALABAMA (Aircraft-based Laser Ablation Aerosol Mass Spectrometer) was used to characterize atmospheric aerosol and cloud residual particles. In addition, an optical particle counter was operated to analyse particle size and number concentration.

To identify a suitable analysis method for automatically sorting the single-particle mass spectra into groups of similar-looking spectra, the two algorithms k-means and fuzzy c-means were tested for correctness. It turned out that neither algorithm delivered error-free results, which depends, among other things, on the starting conditions; the fuzzy c-means, however, delivered the more reliable results. In addition, the mass spectra were analysed on the basis of characteristic chemical signatures (nitrate, sulfate, metals).

In autumn 2010 the field campaign HCCT (Hill Cap Cloud Thuringia) took place in the Thuringian Forest, investigating how aerosol particles change while passing through an orographic cloud and which processes take place within the cloud. A comparison of the chemical composition of background aerosol and cloud residual particles showed that the relative fractions of mass spectra of the particle types soot and amines were enhanced for cloud residual particles. This can be explained by the good CCN activity of soot particles internally mixed with nitrate and sulfate, and by a favoured transfer of amine compounds from the gas phase to the particle phase at high relative humidities and low temperatures. Furthermore, it turned out that more than 99% of the background aerosol particles were already internally mixed with nitrate and/or sulfate. A detailed analysis of the mixing state of the aerosol particles showed that both the nitrate and the sulfate content of the particles increased while passing through the cloud.
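As a hedged illustration of the kind of unsupervised grouping described above (not the ALABAMA analysis code), the following Python sketch clusters single-particle mass spectra with scikit-learn's k-means; the spectra array, the m/z range and the number of clusters are assumptions chosen only for the example:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical input: one row per particle, one column per m/z channel
# (here m/z 1..300 with random intensities, for illustration only).
rng = np.random.default_rng(0)
spectra = rng.random((500, 300))

# Normalize each spectrum to its total ion signal so that clustering reflects
# relative peak patterns rather than absolute signal strength.
spectra = spectra / spectra.sum(axis=1, keepdims=True)

# k-means with several random restarts, since the outcome depends on the
# starting conditions (one of the issues noted in the abstract).
kmeans = KMeans(n_clusters=8, n_init=20, random_state=42)
labels = kmeans.fit_predict(spectra)

# Mean spectrum of each cluster, e.g. to look for nitrate (m/z 30, 46),
# sulfate (m/z 64, 80/81, 97/98) or metal marker peaks.
for k in range(kmeans.n_clusters):
    mean_spectrum = spectra[labels == k].mean(axis=0)
    top_mz = np.argsort(mean_spectrum)[-5:][::-1] + 1  # m/z channels start at 1
    print(f"cluster {k}: {np.sum(labels == k)} particles, dominant m/z {top_mz}")
```

A fuzzy c-means variant would instead assign each spectrum a membership degree in every cluster rather than a single hard label, which is the behaviour the abstract reports as more reliable for this data set.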