9 results for HISTORICAL DATA-ANALYSIS

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

The objective of this dissertation is to study the structure and behavior of the Atmospheric Boundary Layer (ABL) in stable conditions. This type of boundary layer is not yet completely understood, although it is very important for many practical applications, from forecast modeling to the atmospheric dispersion of pollutants. We analyzed data from the SABLES98 experiment (Stable Atmospheric Boundary Layer Experiment in Spain, 1998) and compared their behavior with the Monin-Obukhov similarity functions for wind speed and potential temperature. Analyzing the vertical profiles of various variables, in particular the thermal and momentum fluxes, we identified two main contrasting structures describing two different states of the SBL: a traditional and an upside-down boundary layer. We determined the main features of these two states in terms of vertical profiles of potential temperature and wind speed, turbulent kinetic energy and fluxes, studying the time series and the vertical structure of the atmosphere for two separate nights in the dataset, taken as case studies. We also developed an original classification of the SBL, in order to separate the influence of mesoscale phenomena from turbulent behavior, using the wind speed and the gradient Richardson number as parameters. We then compared these two formulations on the SABLES98 dataset, verifying their validity for different variables (wind speed and potential temperature, and their differences, at different heights) and with different stability parameters (the Monin-Obukhov parameter ζ or the gradient Richardson number Rg). Despite the completely different physical origins of the two classifications, we found some common behavior, in particular under weak stability conditions.
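
For reference, the gradient Richardson number used in this classification can be estimated from tower profiles as Rg = (g/θ)(∂θ/∂z) / [(∂u/∂z)² + (∂v/∂z)²]. The Python sketch below shows one way to compute it; the measurement heights and profile values are purely illustrative, not taken from the SABLES98 dataset.

```python
import numpy as np

G = 9.81  # gravitational acceleration [m s^-2]

def gradient_richardson(z, theta, u, v):
    """Gradient Richardson number Rg from vertical profiles.

    z     : heights [m]
    theta : potential temperature [K]
    u, v  : horizontal wind components [m s^-1]
    """
    dtheta_dz = np.gradient(theta, z)
    du_dz = np.gradient(u, z)
    dv_dz = np.gradient(v, z)
    shear2 = du_dz**2 + dv_dz**2
    return (G / theta) * dtheta_dz / np.maximum(shear2, 1e-10)

# Hypothetical weakly stable profile sampled at a few tower levels
z = np.array([5.8, 13.5, 32.0, 96.6])           # m
theta = np.array([285.0, 285.4, 286.1, 287.5])  # K
u = np.array([1.2, 2.0, 3.1, 4.5])              # m/s
v = np.array([0.3, 0.5, 0.8, 1.0])              # m/s
print(gradient_richardson(z, theta, u, v))
```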

Relevance:

100.00%

Publisher:

Abstract:

In particle physics, carrying out data analysis requires a large amount of computing and storage capacity. The LHC Computing Grid is a computing infrastructure on a global scale and, at the same time, a set of services developed by a large community of physicists and computer scientists, distributed across computing centres all over the world. This infrastructure has proved its value in the analysis of the data collected during Run-1 of the LHC, playing a fundamental role in the discovery of the Higgs boson. Today, Cloud computing is emerging as a new computing paradigm for accessing large amounts of resources shared by many scientific communities. Given the technical requirements for Run-2 (and subsequent runs) of the LHC, the scientific community is interested in contributing to the development of Cloud technologies and in verifying whether they can provide a complementary approach, or even a valid alternative, to the existing technological solutions. The aim of this thesis is to test a Cloud infrastructure and compare its performance with that of the LHC Computing Grid. Chapter 1 gives a general account of the Standard Model. Chapter 2 describes the LHC accelerator and the experiments operating at it, with particular attention to the CMS experiment. Chapter 3 deals with computing in high-energy physics and examines the Grid and Cloud paradigms. Chapter 4, the last of this work, reports the results of my work on the comparative analysis of Grid and Cloud performance.

Relevance:

100.00%

Publisher:

Abstract:

Over time, Twitter has become a fundamental source of news and information. As a step forward, researchers have tried to analyse whether tweets have predictive power. In the financial field, a lot of past research has aimed at proposing a function that takes as input all the tweets for a particular stock or index s, analyses them, and predicts the price of s. In this work we take an alternative approach: using stock price and tweet information, we investigate the following questions. 1. Is there any relation between the amount of tweets being generated and the volume of stocks being exchanged? 2. Is there any relation between the sentiment of the tweets and stock prices? 3. What is the structure of the graph that describes the relationships between users?
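
Purely as an illustration of the kind of analysis these questions call for, the following Python sketch correlates daily tweet counts and mean sentiment with traded volume and price returns, and inspects a toy user-interaction graph. All series, column names and edges are hypothetical placeholders, not the thesis's actual data.

```python
import pandas as pd
import networkx as nx

# Hypothetical aligned daily series for one stock
df = pd.DataFrame({
    "tweet_count":    [120, 340, 280, 510, 430],
    "mean_sentiment": [0.10, -0.20, 0.05, 0.30, -0.10],
    "traded_volume":  [1.2e6, 2.8e6, 2.1e6, 3.9e6, 3.2e6],
    "close_price":    [10.1, 9.8, 9.9, 10.4, 10.0],
})

# Q1: tweet volume vs. exchanged volume (Pearson correlation)
print(df["tweet_count"].corr(df["traded_volume"]))

# Q2: sentiment vs. daily price returns
returns = df["close_price"].pct_change()
print(df["mean_sentiment"].corr(returns))

# Q3: structure of the user-interaction graph (edges are illustrative)
g = nx.DiGraph([("user_a", "user_b"), ("user_b", "user_c"), ("user_a", "user_c")])
print(nx.density(g), sorted(d for _, d in g.degree()))
```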

Relevance:

100.00%

Publisher:

Abstract:

VIRTIS, on board Venus Express, is a spectrometer able to operate from 0.25 to 5 µm. Between 2006 and 2011 it collected an enormous amount of data, but to date the limb observations have been little used for the study of the clouds and hazes, especially at night. Limb spectra at mesospheric altitudes are dominated by the radiance coming from the clouds and scattered towards the instrument by the hazes. The interpretation of limb spectra therefore cannot be separated from a characterization of the whole atmospheric column. The objective of this thesis is to carry out a statistical analysis of the nadir observations and to propose a methodology for characterizing the hazes by combining nadir and limb observations. The characterization of the clouds was performed on a sample of more than 3700 nadir observations. A large dataset of synthetic spectra was created by modifying, in an initial model, various cloud parameters such as chemical composition and particle number and size. A fitting procedure was applied to the observations to establish which model could describe the observed spectra, and a statistical analysis was then performed on the results for the sample. A very high sulfuric acid concentration was found in the lower clouds, equal to 96% by mass, which departs from the generally adopted value of 75%. The nadir results were then integrated with a targeted study of a few limb observations, selected so that their tangent point intercepts the atmospheric column observed at nadir, in order to derive information on the hazes. The results of a Monte Carlo model indicate that the number and size of the particles predicted by the baseline model must be reduced significantly. In particular, the maximum altitude of the hazes is lower than in daytime observations.
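
As a minimal sketch of the grid-fit step described above, the code below compares an observed spectrum against a library of synthetic spectra and selects the best match by a chi-square misfit. The wavelength grid, model parameters and spectra are hypothetical placeholders, not the thesis's actual radiative-transfer models.

```python
import numpy as np

def best_fit_model(observed, synthetic_library, uncertainty=None):
    """Select the synthetic spectrum that best matches an observed one.

    observed          : (n_wavelengths,) observed radiance
    synthetic_library : dict mapping model parameters -> synthetic radiance array
    uncertainty       : optional per-channel measurement uncertainty
    """
    sigma = uncertainty if uncertainty is not None else np.ones_like(observed)
    scores = {
        params: np.sum(((observed - synth) / sigma) ** 2)  # chi-square misfit
        for params, synth in synthetic_library.items()
    }
    best = min(scores, key=scores.get)
    return best, scores[best]

# Hypothetical library: keys are (H2SO4 mass fraction, particle radius in µm)
wl = np.linspace(1.0, 2.5, 200)
library = {
    (0.75, 1.0): np.exp(-(wl - 1.70) ** 2 / 0.05),
    (0.96, 1.0): np.exp(-(wl - 1.74) ** 2 / 0.05),
}
obs = np.exp(-(wl - 1.74) ** 2 / 0.05) + np.random.normal(0, 0.01, wl.size)
print(best_fit_model(obs, library))
```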

Relevance:

90.00%

Publisher:

Abstract:

The aim of this thesis is to provide a geochemical characterization of the Seehausen territory, a neighborhood of Bremen, Germany. This territory hosts a landfill of dredged sediments coming both from Bremerhaven (North Sea) and from the Bremen harbor (directly on the river Weser). For this reason, this work also focuses on the possible impacts of the landfill on the groundwater (shallow and deep aquifers). The Seehausen landfill uses a dewatering technique to manage the dredged sediments: incoming sediments are placed in dewatering fields until they are completely dry, which takes almost a year. They are then randomly sampled and analyzed: if the pollutant content is acceptable, the sediments are treated with other materials and used in place of raw material for embankments, bricks, etc.; otherwise they are disposed of in the landfill. This work includes a study of the natural geology and hydrogeology of the whole area of interest, especially because the area is characterized by ancient natural salt deposits. Then, together with the Geological Survey of Bremen and the Harbor Authority of Bremen, all the piezometers useful for a monitoring network around the landfill were identified. During the sampling campaign, data on the principal anions and cations, physical parameters and stable water isotopes were collected. The data analysis focuses in particular on Cl, Na, SO4 and EC, because these parameters can help attribute geochemical trends either to the landfill or to a natural background. Furthermore, data loggers were installed in some piezometers for a month, collecting EC, pressure, dissolved oxygen and temperature data. Finally, a thorough comparison was made between current and historical data (1996 – 2011), and between old and current interpolation maps, in order to identify time trends in the aquifer geochemistry.
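
As an illustration of the current-versus-historical comparison described above, the following Python sketch summarizes a parameter per piezometer and per period. The table layout, column names, piezometer ids and values are hypothetical, not the thesis's actual monitoring data.

```python
import pandas as pd

# Hypothetical long-format table: one row per sample
data = pd.DataFrame({
    "piezometer": ["P1", "P1", "P1", "P1"],
    "date": pd.to_datetime(["1998-05-01", "2011-06-01", "2015-07-01", "2016-08-01"]),
    "parameter": ["Cl", "Cl", "Cl", "Cl"],
    "value": [180.0, 210.0, 260.0, 255.0],   # e.g. mg/L
})

# Split samples into the historical (1996-2011) and current periods
data["period"] = data["date"].dt.year.map(
    lambda y: "historical (1996-2011)" if y <= 2011 else "current"
)
summary = data.groupby(["piezometer", "parameter", "period"])["value"].agg(["mean", "std", "count"])
print(summary)
```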

Relevance:

90.00%

Publisher:

Abstract:

This thesis was developed in the context of Work Package 1 of the Ritmare project, whose main objective is the development of a sustainable fishery through the identification of population boundaries in commercially important species in Italian seas. Three main objectives are discussed, all contributing to the identification of stock boundaries in Parapenaeus longirostris: 1 - development of a representative sampling design for Italian seas; 2 - evaluation of the 2b-RAD protocol; 3 - investigation of populations through biological data analysis. First of all, we defined and carried out a sampling design that properly represents all Italian seas. We then used information and data on the distribution of nursery areas, the abundance of populations and the importance of P. longirostris in local fisheries to develop an experimental design that prioritizes the most important areas, maximizing the results achievable with the available project funds. We introduced for the first time the use of 2b-RAD on this species, a genotyping method based on sequencing the uniform fragments produced by type IIB restriction endonucleases. Thanks to this method we were able to move from genetics to the more complex field of genomics. In order to proceed with 2b-RAD, we performed several tests to identify the best DNA extraction kit and protocol, and finally obtained 192 high-quality DNA extracts ready to be processed. We tested 2b-RAD on five samples and, after high-throughput sequencing of the libraries, used the software "Stacks" to analyze the sequences. We obtained positive results, identifying a large number of SNP markers among the five samples. To guarantee a multidisciplinary approach, we used the biological data associated with the collected samples to investigate differences between geographical samples. Such an approach ensures continuity with other projects, for instance STOCKMED, which likewise combine molecular and biological analyses.
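
Purely as an illustration of a downstream step, the sketch below counts biallelic SNPs and per-sample genotyped loci from SNP calls exported in standard VCF format (as Stacks can produce); the file name is a hypothetical placeholder, and this is not the thesis's actual pipeline.

```python
# Minimal sketch: summarize biallelic SNPs from a standard VCF file.

def summarize_vcf(path):
    n_snps = 0
    samples, genotyped = [], []
    with open(path) as fh:
        for line in fh:
            if line.startswith("##"):
                continue                      # skip meta-information lines
            fields = line.rstrip("\n").split("\t")
            if line.startswith("#CHROM"):
                samples = fields[9:]          # sample names from the header line
                genotyped = [0] * len(samples)
                continue
            ref, alt = fields[3], fields[4]
            if len(ref) == 1 and len(alt) == 1:   # biallelic SNP
                n_snps += 1
                for i, call in enumerate(fields[9:]):
                    if not call.startswith("./."):  # sample has a genotype call
                        genotyped[i] += 1
    return n_snps, dict(zip(samples, genotyped))

print(summarize_vcf("populations.snps.vcf"))  # hypothetical output file name
```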

Relevance:

90.00%

Publisher:

Abstract:

Study and analysis of the main techniques in the field of Social Data Analysis. Design and implementation of a software solution written in Java in the Eclipse environment. The software integrates different REST API services for the extraction of social data from Twitter, their storage in a non-relational database (built with MongoDB), and their management. It also supports topic classification and the analysis of aggregate statistics on the collections of extracted data. Finally, it can display a tree of "re-shares", starting from selected tweets, and a geo-localized map containing the users involved in the chain of re-shares and the corresponding "retweet" edges.
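
The thesis implements this pipeline in Java; purely as an illustration of the data flow it describes (tweet extraction, MongoDB storage, retweet tree), here is a minimal Python sketch using pymongo and networkx. The database, collection and field names are hypothetical, not those of the thesis's software.

```python
from pymongo import MongoClient
import networkx as nx

client = MongoClient("mongodb://localhost:27017")   # local MongoDB instance
tweets = client["social_data"]["tweets"]            # hypothetical database/collection

# Store a tweet document as returned by a REST API call (fields are illustrative)
tweets.insert_one({"id": 3, "user": "user_c", "text": "RT ...", "retweet_of": 1})

# Build the re-share ("retweet") tree rooted at a selected tweet
graph = nx.DiGraph()
for doc in tweets.find({"retweet_of": {"$ne": None}}):
    graph.add_edge(doc["retweet_of"], doc["id"])     # edge: original tweet -> retweet
print(list(nx.bfs_edges(graph, source=1)))           # traverse the tree from tweet 1
```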

Relevance:

90.00%

Publisher:

Abstract:

With the outlook of improving seismic vulnerability assessment for the city of Bishkek (Kyrgyzstan), the global dynamic behaviour of four nine-storey reinforced-concrete (r.c.) large-panel buildings in the elastic regime is studied. The four buildings were built during the Soviet era within a serial production system. Since they all belong to the same series, they have very similar geometries both in plan and in height. Firstly, ambient vibration measurements are performed in the four buildings. The data analysis, composed of discrete Fourier transform, modal analysis (frequency domain decomposition) and deconvolution interferometry, yields the modal characteristics and an estimate of the linear impulse response function for the structures of the four buildings. Then, finite element models are set up for all four buildings and the results of the numerical modal analysis are compared with the experimental ones. The numerical models are finally calibrated considering the first three global modes, and their results match the experimental ones with an error of less than 20%.
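
As an illustration of the frequency domain decomposition step mentioned above, the following Python sketch computes the first singular value of the cross power spectral density matrix at each frequency; peaks of this curve indicate candidate modal frequencies. The channel count, sampling rate and segment length are assumptions, not the parameters used in the thesis.

```python
import numpy as np
from scipy.signal import csd

def fdd_first_singular_values(acc, fs, nperseg=1024):
    """Frequency Domain Decomposition: first singular value of the
    cross power spectral density (CPSD) matrix at each frequency.

    acc : (n_channels, n_samples) ambient acceleration records
    fs  : sampling frequency [Hz]
    """
    n_ch = acc.shape[0]
    f, _ = csd(acc[0], acc[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
    # Singular value decomposition at each frequency line
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(len(f))])
    return f, s1

# Usage on synthetic noise (illustrative only):
acc = np.random.randn(4, 60000)   # 4 channels, 10 minutes at 100 Hz
f, s1 = fdd_first_singular_values(acc, fs=100)
```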

Relevance:

90.00%

Publisher:

Abstract:

The goal of my study is to investigate the relationship between selected deictic shields on the pronoun 'I' and the involvement/detachment dichotomy in a sample of television news interviews, focusing on the use of personal pronouns in political discourse. Drawing upon Caffi's (2007) classification of mitigating devices into bushes, hedges and shields, I examine the way a selection of 'I'-related deictic shields is employed in a collection of news interviews broadcast during the electoral campaign prior to the UK 2015 General Election. My purpose is to uncover the frequency of each of the selected linguistic items and their pragmatic functions within the involvement/detachment dichotomy. The research is structured as follows. Chapter 1 provides an account of previous studies in the three main areas of research: speech event analysis, institutional interaction and the news interview, and the UK 2015 General Election television programmes. Chapter 2 is centred on the involvement/detachment dichotomy: I provide an overview of non-linguistic and linguistic features of involvement and detachment at all levels of sentence structure. Chapter 3 contains a detailed account of the data collection and data analysis process. Chapter 4 provides a detailed description of the results in three steps: quantitative analysis, qualitative analysis and discussion of the pragmatic functions of the selected linguistic features of involvement and detachment. Chapter 5 includes a brief summary of the investigation, reviews the main findings, and indicates the limitations of the study and possible directions for further research. The results of the analysis confirm that, while some of the linguistic items examined point toward involvement, others have a detaching effect. I therefore conclude that deictic shields on the pronoun 'I' permit the realisation of the involvement/detachment dichotomy in the speech genre of the news interview.
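
Purely as an illustration of the quantitative step (frequency counts of selected 'I'-related items in interview transcripts), here is a minimal Python sketch. The list of shield phrases and the transcript text are hypothetical examples, not the thesis's actual inventory or data.

```python
import re
from collections import Counter

# Hypothetical 'I'-related deictic shields (not the thesis's actual selection)
shields = ["I think", "I believe", "I mean", "I suppose"]

transcript = "Well, I think the plan is credible. I mean, I believe voters will see that. I think so."

counts = Counter()
for phrase in shields:
    counts[phrase] = len(re.findall(r"\b" + re.escape(phrase) + r"\b", transcript, flags=re.IGNORECASE))
print(counts)  # raw frequency per linguistic item
```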