827 results for Twitter Financial Market Pearson cross correlation


Relevância:

100.00%

Publicador:

Resumo:

For seven years now, the permanent GPS station at Baia Terranova has been acquiring daily data which, suitably processed, contribute to the understanding of Antarctic dynamics and help verify whether global geophysical models are consistent with the area of interest of the permanent GPS station. A literature review showed that a GPS series is subject to many possible perturbations, mainly due to errors in modelling some of the ancillary data required for processing. Moreover, several analyses have shown that such time series derived from geodetic surveys are affected by different types of noise which, if not properly accounted for, can alter the parameters of interest for the geophysical interpretation of the data. This thesis aims to understand to what extent these errors can affect the dynamic parameters that characterize the motion of the permanent station, with particular reference to the velocity of the point on which the station is installed and to any periodic signals that may be identified.
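
The velocity and periodic signals described above are typically estimated by least squares from the daily coordinate series. A minimal sketch (not the thesis's actual processing chain), assuming a synthetic one-component series with an invented 2 mm/yr trend and a 3 mm annual signal:

```python
import numpy as np

def fit_trend_and_seasonal(t_years, coord_mm):
    # Design matrix: intercept, linear velocity, annual sine/cosine terms.
    w = 2 * np.pi  # one cycle per year
    A = np.column_stack([
        np.ones_like(t_years),
        t_years,
        np.sin(w * t_years),
        np.cos(w * t_years),
    ])
    coeffs, *_ = np.linalg.lstsq(A, coord_mm, rcond=None)
    intercept, velocity, s, c = coeffs
    amplitude = np.hypot(s, c)  # amplitude of the annual signal
    return velocity, amplitude

# Seven years of synthetic daily data: 2 mm/yr drift plus a 3 mm annual term.
t = np.arange(0, 7, 1 / 365.25)
rng = np.random.default_rng(0)
series = 2.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 0.5, t.size)
v, amp = fit_trend_and_seasonal(t, series)
```

In real GPS series the noise is coloured (flicker or random-walk components), so a plain least-squares fit like this one underestimates the velocity uncertainty, which is precisely the issue the abstract raises.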


A regional envelope curve (REC) of flood flows summarises the current bound on our experience of extreme floods in a region. RECs are available for most regions of the world. Recent scientific papers introduced a probabilistic interpretation of these curves and formulated an empirical estimator of the recurrence interval T associated with a REC, which, in principle, enables us to use RECs for design purposes in ungauged basins. The aim of this work is twofold. First, it extends the REC concept to extreme rainstorm events by introducing Depth-Duration Envelope Curves (DDECs), defined as the regional upper bound on all rainfall depths currently on record for various rainfall durations. Second, it adapts the probabilistic interpretation proposed for RECs to DDECs and assesses the suitability of these curves for estimating the T-year rainfall event associated with a given duration, for large values of T. Probabilistic DDECs are complementary to regional frequency analysis of rainstorms, and their use in combination with a suitable rainfall-runoff model can provide useful indications of the magnitude of extreme floods in gauged and ungauged basins. The study focuses on two national datasets: the peak-over-threshold (POT) series of rainfall depths with durations of 30 min and 1, 3, 9 and 24 h obtained for 700 Austrian raingauges, and the annual maximum series (AMS) of rainfall depths with durations spanning from 5 min to 24 h collected at 220 raingauges located in northern-central Italy. Estimating the recurrence interval of a DDEC requires quantifying the equivalent number of independent data, which in turn is a function of the cross-correlation among sequences. While quantifying and modelling intersite dependence is straightforward for AMS series, it may be cumbersome for POT series. This paper proposes a possible approach to this problem.
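
A Depth-Duration Envelope Curve as defined above is just the regional upper bound of record depths versus duration. A toy construction with invented record depths for three gauges, using a power-law envelope fitted in log-log space (a common but here assumed parameterization, not necessarily the paper's):

```python
import numpy as np

# Record rainfall depths (mm): one row per gauge, one column per duration (hours).
durations = np.array([0.5, 1.0, 3.0, 9.0, 24.0])
records = np.array([
    [30., 45., 80., 120., 160.],
    [25., 50., 90., 140., 200.],
    [35., 40., 70., 110., 150.],
])

# The empirical DDEC is the regional upper bound: the max record per duration.
envelope = records.max(axis=0)

# Fit a power law h = a * d^n through the envelope in log-log space, then
# shift it upward so that it bounds every single record (envelope property).
n, log_a = np.polyfit(np.log(durations), np.log(envelope), 1)
log_a += np.max(np.log(records) - (n * np.log(durations) + log_a))

def ddec(d_hours):
    """Envelope depth (mm) for a given duration in hours."""
    return np.exp(log_a) * d_hours ** n
```

Attaching a recurrence interval T to this curve would then require the equivalent number of independent data, i.e. correcting the sample size for cross-correlation among gauges, as the abstract explains.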


Since the first underground nuclear explosion, carried out in 1958, the analysis of seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise definition of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. If a specific event is deemed suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is expected to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test; high-quality seismological systems are thought to be capable of detecting and locating the very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method have a high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns the reliable estimation of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both techniques, we used cross-correlation between digital waveforms in order to minimize the errors linked to incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary tests of the reliability of our location techniques based on simulations, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using the data recorded by the IMS. At first, the algorithm was applied to the differences among the original arrival times of the P phases, without cross-correlation. We found that the considerable geometrical spreading noticeable in the standard locations (namely the locations produced by the analysts of the International Data Center (IDC) of the CTBT Organization, taken as our reference) is substantially reduced by the application of our technique. This is what we expected, since the methodology was applied to a sequence of events for which a real closeness among the hypocenters can be assumed, all belonging to the same seismic structure. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times are removed, or at least reduced.
The introduction of cross-correlation did not bring evident improvements to our results: the two sets of locations (without and with the cross-correlation technique) are very similar to each other. This suggests that cross-correlation did not substantially improve the precision of the manual picks: the picks reported by the IDC are probably good enough to make the random picking error less important than the systematic error on travel times. A further explanation for the limited benefit of cross-correlation is that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller), and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to cross-correlation, we performed a signal interpolation in order to improve the time resolution. The resulting algorithm was applied to the data collected during an experiment carried out in Israel between 1998 and 1999. The results point to the following conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in poor SNR conditions). Another remarkable point of our procedure is that it does not require long processing times, so the user can check the results immediately. During a field survey, this feature makes a quasi-real-time check possible, allowing immediate optimization of the array geometry if the early results suggest it.
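
The cross-correlation re-picking used at both scales reduces to finding the lag that maximizes the correlation between two similar waveforms. A minimal sketch with two synthetic Gaussian "phases" offset by 0.12 s (invented signals, not the thesis data):

```python
import numpy as np

def cc_delay(x, y, dt):
    """Lag (seconds) that best aligns y with x, from the full cross-correlation."""
    cc = np.correlate(y, x, mode="full")
    lag = np.argmax(cc) - (len(x) - 1)  # lag 0 sits at index len(x)-1
    return lag * dt

dt = 0.01                                     # 100 Hz sampling
t = np.arange(0, 2, dt)
pulse = np.exp(-((t - 0.5) / 0.05) ** 2)      # synthetic phase at station/sensor A
shifted = np.exp(-((t - 0.62) / 0.05) ** 2)   # same phase, arriving 0.12 s later
delay = cc_delay(pulse, shifted, dt)
```

Sub-sample precision, as pursued in the local-scale procedure, would additionally require interpolating the signals or the correlation peak, as the abstract notes.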


Realization of a 3D cross-correlation instrument for investigating the structure and dynamics of highly concentrated colloids. This work presents a novel 3D cross-correlation setup for the multiple-scattering-free investigation of the diffusive behaviour of highly concentrated colloidal suspensions. Two light-scattering experiments are performed simultaneously on the same scattering volume and with the same scattering vector; the dynamic behaviour of the colloids can be determined from the resulting cross-correlation function. Besides the direct interaction, the electroviscous effect and the hydrodynamic interaction play a decisive role in particle diffusion, and at high concentrations none of these three effects can be neglected. Since the differences in the diffusion coefficients to be measured are very small, the experimental setup was characterized in detail; in doing so, theoretical considerations concerning the afterpulsing and dead time of the Si avalanche photodiodes used could be verified. The short-time self-diffusion coefficient of highly concentrated charged colloidal suspensions was measured. To normalize the data correctly at high concentrations, the electroviscous effect was studied thoroughly at low concentrations, showing that the single-particle electroviscous effect leads to a monotonic decrease of the diffusion coefficient with decreasing ionic strength. Based on the volume-fraction-dependent data for the short-time self-diffusion coefficient, it could be shown for the first time that the hydrodynamic interaction has a weaker influence on diffusion when the direct interaction potential is a Coulomb potential rather than a hard-sphere potential.
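
In such a multiple-scattering-suppression experiment, the measured intensity cross-correlation of the two detectors decays, for dilute monodisperse spheres, as g2(τ) − 1 = β·exp(−2Dq²τ). A sketch of extracting D from a synthetic, noiseless decay; the values of q, D and β are invented:

```python
import numpy as np

def fit_diffusion(tau, g2m1, q):
    """Diffusion coefficient from g2(tau)-1 = beta * exp(-2 D q^2 tau).

    A linear fit of log(g2-1) versus tau gives slope = -2 D q^2.
    """
    slope, _ = np.polyfit(tau, np.log(g2m1), 1)
    return -slope / (2 * q ** 2)

q = 2.3e7                                   # scattering vector magnitude (1/m)
D_true = 4.0e-12                            # m^2/s, invented
tau = np.linspace(1e-6, 1e-3, 200)          # correlation lag times (s)
g2m1 = 0.25 * np.exp(-2 * D_true * q ** 2 * tau)   # beta = 0.25 intercept
D_est = fit_diffusion(tau, g2m1, q)
```

The cross-correlation intercept β is reduced relative to an autocorrelation experiment; that loss is the price paid for suppressing multiple scattering.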


The southern Apennines of Italy have experienced several destructive earthquakes in both historic and recent times. The present-day seismicity, characterized by small-to-moderate magnitude earthquakes, was used as a probe to obtain a deeper knowledge of the fault structures where the largest earthquakes occurred in the past. With the aim of inferring a three-dimensional seismic image, both the problem of data quality and the selection of a reliable and robust tomographic inversion strategy were addressed. Data quality was ensured by developing optimized procedures for the measurement of P- and S-wave arrival times, through the use of polarization filtering and the application of a refined re-picking technique based on the cross-correlation of waveforms. An iterative, linearized, damped tomographic inversion combined with a multiscale inversion strategy was adopted. The retrieved P-wave velocity model indicates a strong velocity variation along the direction orthogonal to the Apenninic chain. This variation defines two domains characterized by relatively low and high velocity values, respectively. Comparing the inferred P-wave velocity model with a portion of a structural section available in the literature, the high-velocity body was correlated with the Apulian carbonate platform, whereas the low-velocity bodies were associated with the basinal deposits. The deduced Vp/Vs ratio is lower than 1.8 in the shallower part of the model, while at depths between 5 km and 12 km it increases up to 2.1 in correspondence with the area of higher seismicity. This confirms that areas characterized by higher Vp/Vs values are more prone to generating earthquakes, as a response to the presence of fluids and higher pore pressures.
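
As an illustrative aside (not the tomographic method of the thesis), a Vp/Vs ratio like the one quoted above can be estimated from arrival times alone with the classical Wadati diagram: the S−P delay grows linearly with the P arrival time with slope Vp/Vs − 1, so the unknown origin time cancels. A sketch with synthetic picks and an invented true ratio of 1.9:

```python
import numpy as np

vp_vs_true = 1.9
t_origin = 10.0                                    # event origin time (s), invented
tp_travel = np.linspace(2.0, 12.0, 15)             # P travel times to 15 stations
tp = t_origin + tp_travel                          # P arrival times
ts = t_origin + vp_vs_true * tp_travel             # S arrival times

rng = np.random.default_rng(5)
tp_obs = tp + rng.normal(0, 0.02, tp.size)         # simulated picking errors
ts_obs = ts + rng.normal(0, 0.04, ts.size)

# Wadati diagram: (S - P) versus P arrival time; slope = Vp/Vs - 1.
slope, intercept = np.polyfit(tp_obs, ts_obs - tp_obs, 1)
vp_vs_est = 1.0 + slope
```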


Myocardial perfusion quantification by means of contrast-enhanced Cardiac Magnetic Resonance images relies on time-consuming frame-by-frame manual tracing of regions of interest. In this thesis, a novel automated technique for myocardial segmentation and non-rigid registration as a basis for perfusion quantification is presented. The proposed technique is based on three steps: reference frame selection, myocardial segmentation and non-rigid registration. In the first step, the reference frame in which both endo- and epicardial segmentation will be performed is chosen. Endocardial segmentation is achieved by means of a statistical region-based level-set technique followed by a curvature-based regularization motion. Epicardial segmentation is achieved by means of an edge-based level-set technique, again followed by a regularization motion. To take into account the changes in position, size and shape of the myocardium throughout the sequence due to out-of-plane respiratory motion, a non-rigid registration algorithm is required. The proposed non-rigid registration scheme consists of a novel multiscale extension of the normalized cross-correlation algorithm in combination with level-set methods. The myocardium is then divided into standard segments. Contrast enhancement curves are computed by measuring the mean pixel intensity of each segment over time, and perfusion indices are extracted from each curve. The overall approach has been tested on synthetic and real datasets. For validation purposes, the sequences were manually traced by an experienced interpreter, and contrast enhancement curves as well as perfusion indices were computed. Comparisons between automatically extracted and manually obtained contours and enhancement curves showed high inter-technique agreement.
Comparisons of perfusion indices computed using both approaches against quantitative coronary angiography and visual interpretation demonstrated that the two techniques have similar diagnostic accuracy. In conclusion, the proposed technique allows fast, automated and accurate measurement of intra-myocardial contrast dynamics, and may thus address the strong clinical need for quantitative evaluation of myocardial perfusion.
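
The contrast-enhancement curves described above are simply per-segment mean intensities over time. A minimal sketch with a toy labelled sequence (invented segment map and intensities), including a simple maximum-upslope perfusion index:

```python
import numpy as np

def enhancement_curves(frames, labels, n_segments):
    """Mean intensity of each labelled myocardial segment in every frame.

    frames: (T, H, W) image sequence; labels: (H, W) integer segment map
    with 0 = background and 1..n_segments = myocardial segments.
    """
    curves = np.zeros((n_segments, frames.shape[0]))
    for s in range(1, n_segments + 1):
        mask = labels == s
        curves[s - 1] = frames[:, mask].mean(axis=1)
    return curves

# Toy sequence: two segments, the second enhancing twice as fast as the first.
T, H, W = 5, 4, 4
labels = np.zeros((H, W), dtype=int)
labels[:, :2] = 1
labels[:, 2:] = 2
frames = np.stack([np.where(labels == 1, 10 * t, 20 * t) for t in range(T)]).astype(float)

curves = enhancement_curves(frames, labels, 2)
upslopes = np.diff(curves, axis=1).max(axis=1)  # one perfusion index per segment
```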


Asset allocation choices are a recurring problem for every investor, who must continually combine different asset classes to arrive at an investment consistent with his or her preferences. The need to support asset managers in this task has fed, over time, a vast literature proposing numerous portfolio-construction strategies and models. This thesis attempts to review some innovative forecasting models and some tactical asset allocation strategies, and then to evaluate their practical implications. First, we test for relationships between the dynamics of selected macroeconomic variables and financial markets, with the aim of identifying an econometric model capable of guiding managers' strategies in building their investment portfolios. The analysis considers the US market during a period characterized by rapid economic transformation and high equity-price volatility. Second, we examine the validity of momentum and contrarian trading strategies in futures markets, in particular those of the Eurozone, which lend themselves well to such strategies thanks to the absence of shorting constraints and to low transaction costs. The investigation shows that both anomalies are stable over time: the abnormal returns persist even when traditional asset-pricing models are used, such as the CAPM, the Fama-French model and the Carhart model. Finally, using an EGARCH-M approach, we forecast the volatility of the returns of the stocks in the Dow Jones; these forecasts are then used as inputs to determine the views to feed into the Black-Litterman model.

The results show, for several values of the scalar tau, average excess returns of the new combined vector that exceed the vector of market-equilibrium excess returns, albeit with higher levels of risk.
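
The "new combined vector" referred to above is the posterior mean of the Black-Litterman model, which blends the equilibrium excess returns π with the views (P, Q, Ω) through the scalar τ. A numpy sketch of the standard formula, with invented two-asset numbers and a single absolute view (not the thesis's data):

```python
import numpy as np

def black_litterman(pi, Sigma, P, Q, Omega, tau):
    """Black-Litterman posterior ("new combined") excess-return vector:
    mu = [(tau*Sigma)^-1 + P' Omega^-1 P]^-1 [(tau*Sigma)^-1 pi + P' Omega^-1 Q]."""
    ts_inv = np.linalg.inv(tau * Sigma)
    O_inv = np.linalg.inv(Omega)
    post_cov = np.linalg.inv(ts_inv + P.T @ O_inv @ P)
    return post_cov @ (ts_inv @ pi + P.T @ O_inv @ Q)

# Two assets; one absolute view: asset 0 will return 5% (all numbers invented).
pi = np.array([0.03, 0.04])                      # equilibrium excess returns
Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])   # covariance of returns
P = np.array([[1.0, 0.0]])                       # view-picking matrix
Q = np.array([0.05])                             # view value
Omega = np.array([[0.01]])                       # view uncertainty
mu_bl = black_litterman(pi, Sigma, P, Q, Omega, tau=0.05)
```

Raising τ weakens the precision of the equilibrium prior, so the views pull the combined vector further from π, which is why the thesis reports results for several values of τ.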


After reviewing the literature, we asked whether the many advertising, promotional and marketing investments, in a word "intangible" investments, generate an increase in firm value in a valuation context, or whether they give rise exclusively to increases in revenue. The most coveted objective is to capitalize investments in intangible assets such as brand building, the use of patents, customer-satisfaction initiatives and everything that can be defined as immaterial. Yet all of these coexist within the same company. Until marketing performance can be incorporated into firm-valuation criteria there can be no growth, because resources are deployed without any return-on-investment criterion.


The interplay of hydrodynamic and electrostatic forces is of great importance for the understanding of colloidal dispersions. Theoretical descriptions are often based on the so called standard electrokinetic model. This Mean Field approach combines the Stokes equation for the hydrodynamic flow field, the Poisson equation for electrostatics and a continuity equation describing the evolution of the ion concentration fields. In the first part of this thesis a new lattice method is presented in order to efficiently solve the set of non-linear equations for a charge-stabilized colloidal dispersion in the presence of an external electric field. Within this framework, the research is mainly focused on the calculation of the electrophoretic mobility. Since this transport coefficient is independent of the electric field only for small driving, the algorithm is based upon a linearization of the governing equations. The zeroth order is the well known Poisson-Boltzmann theory and the first order is a coupled set of linear equations. Furthermore, this set of equations is divided into several subproblems. A specialized solver for each subproblem is developed, and various tests and applications are discussed for every particular method. Finally, all solvers are combined in an iterative procedure and applied to several interesting questions, for example, the effect of the screening mechanism on the electrophoretic mobility or the charge dependence of the field-induced dipole moment and ion clouds surrounding a weakly charged sphere. In the second part a quantitative data analysis method is developed for a new experimental approach, known as "Total Internal Reflection Fluorescence Cross-Correlation Spectroscopy" (TIR-FCCS). The TIR-FCCS setup is an optical method using fluorescent colloidal particles to analyze the flow field close to a solid-fluid interface. 
The interpretation of the experimental results requires a theoretical model, which is usually the solution of a convection-diffusion equation. Since an analytic solution is not available due to the form of the flow field and the boundary conditions, an alternative numerical approach is presented, based on stochastic methods, i.e., a combination of a Brownian Dynamics algorithm and Monte Carlo techniques. Finally, experimental measurements for a hydrophilic surface are analyzed using this new numerical approach.
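
The stochastic approach mentioned for TIR-FCCS propagates tracer particles through the flow field with a Brownian Dynamics scheme. A minimal sketch, assuming (for illustration only) a linear shear flow near the wall, a reflecting solid-fluid interface and invented parameters:

```python
import numpy as np

def simulate_tracers(n, steps, dt, D, shear_rate, rng):
    """Euler scheme dx = u(z) dt + sqrt(2 D dt) dW with u = shear_rate * z.

    Particles start at height z = 1; z is reflected at the z = 0 interface.
    Columns of the returned array: x (flow direction) and z (wall-normal).
    """
    pos = np.zeros((n, 2))
    pos[:, 1] = 1.0
    for _ in range(steps):
        drift = np.column_stack([shear_rate * pos[:, 1] * dt, np.zeros(n)])
        noise = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n, 2))
        pos += drift + noise
        pos[:, 1] = np.abs(pos[:, 1])   # reflecting boundary at the solid surface
    return pos

rng = np.random.default_rng(1)
final = simulate_tracers(n=2000, steps=200, dt=1e-3, D=1.0, shear_rate=5.0, rng=rng)
```

Histograms of such trajectories, compared against the measured correlation curves via Monte Carlo averaging, are the kind of link between simulation and experiment that the abstract describes.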


A unique characteristic of soft matter is its ability to self-assemble into larger structures, and characterizing these structures is crucial for their applications. In the first part of this work, I investigated DNA-organic hybrid materials by means of Fluorescence Correlation Spectroscopy (FCS) and Fluorescence Cross-Correlation Spectroscopy (FCCS). DNA-organic hybrid materials, a novel class of hybrid materials composed of synthetic macromolecules and oligodeoxynucleotide segments, are mostly amphiphilic and can self-assemble into supramolecular structures in aqueous solution. A hybrid material of a fluorophore, perylenediimide (PDI), and a DNA segment (DNA-PDI) has been developed in Prof. A. Hermann's group (University of Groningen). This novel material can form aggregates through pi-pi stacking between planar PDIs and can be traced in solution thanks to the fluorescence of PDI. I determined the diffusion coefficient of DNA-PDI conjugates in aqueous solution by means of FCS. In addition, I investigated whether such DNA-PDIs form aggregates with a certain structure, for instance dimers. Once DNA hybrid materials self-assemble into supramolecular structures, for instance micelles, the single molecules do not necessarily stay in one specific micelle; a single molecule may enter and leave micelles constantly, and its average residence time in a given micelle depends on the nature of the molecule. I chose DNA-b-polypropylene oxide (PPO) as a model molecule and investigated the residence time of DNA-b-PPO molecules in their micelles by means of FCCS. Besides the DNA hybrid materials, polymeric colloids can also form ordered structures once they are brought to an air/water interface, where hexagonally close-packed monolayers can be generated. These monolayers can be deposited onto different surfaces as coating layers.
In the second part of this work, I investigated the mechanical properties of such colloidal monolayers using micromechanical cantilevers. When a coating layer is deposited on a cantilever, it modifies the elasticity of the cantilever; this variation is reflected either in a deflection or in a resonance-frequency shift, and detecting these changes in turn provides information about the mechanical properties of the coating layer. Polymeric colloidal monolayers were coated onto a cantilever, and homogeneous polymer films a few hundred nanometers thick were generated from these monolayers by thermal or organic-vapor annealing. Both the film-formation process and the mechanical properties of the resulting homogeneous films were investigated by means of the cantilever. Elastic-property changes of the coating film, for example upon absorption of organic vapors, induce a deflection of the cantilever. This effect enables a cantilever to detect target molecules when it is coated with an active layer with a specific affinity for them. In the last part of this thesis, I investigated the applicability of suitably functionalized micromechanical cantilevers as sensors. In particular, glucose-sensitive polymer brushes were grafted onto a cantilever and its deflection was measured during exposure to glucose solution.
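
For the FCS measurements of the first part, the diffusion coefficient follows from the decay time of the autocorrelation curve. A sketch using the simple 2D model G(τ) = (1/N)/(1 + τ/τ_D), which can be fitted by linearization; the focus radius and all numbers are invented for illustration:

```python
import numpy as np

def fcs_model(tau, N, tau_d):
    # 2D-diffusion FCS autocorrelation (planar approximation).
    return (1.0 / N) / (1.0 + tau / tau_d)

def fit_fcs(tau, G):
    # 1/G = N + (N / tau_d) * tau is linear in tau, so a line fit suffices.
    slope, intercept = np.polyfit(tau, 1.0 / G, 1)
    N = intercept
    tau_d = N / slope
    return N, tau_d

w = 0.25e-6                         # lateral focus radius (m), assumed
tau = np.logspace(-6, -1, 100)      # lag times (s)
G = fcs_model(tau, N=5.0, tau_d=1e-3)
N_est, tau_d_est = fit_fcs(tau, G)
D = w ** 2 / (4 * tau_d_est)        # diffusion coefficient from the decay time
```

Dual-colour FCCS, used here for the residence-time measurements, instead evaluates the cross-correlation of two detection channels, which is nonzero only for species carrying both labels.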


This work offers a critical analysis of the Italian and European rules on investment advice. It first considers the general problems of the client-intermediary relationship in the provision of advice, with particular reference to information asymmetries and conflicts of interest. It discusses, in particular, the traditional regulatory paradigm based on transparency: the findings of behavioural finance suggest a more decisive regulatory intervention, aimed at giving the client-intermediary relationship a fiduciary character. After analysing, in light of the theoretical models illustrated above, the historical evolution of the regulation of investment advice in the Italian legal system, the work underlines the role played by the supervisory authority in systematizing the institution, while noting the overall inadequacy of the rules in force for properly protecting investors. It then examines the regime introduced by MiFID, with specific attention to the systematic implications of the broadening of the notion of advice carried out by the supervisory authorities: the new configuration of the service has intensified the fiduciary duties imposed on intermediaries, marking a move beyond the transparency paradigm and towards a more interventionist approach on the supply side. The conclusions, however, point to an only partial development of this process, since the value to the client of advice that is not independent, and that is potentially exposed to conflicts of interest (especially within multifunctional intermediaries), remains doubtful. Particular emphasis is placed on the need to introduce genuinely independent advice, despite the difficulties such regulation faces in the Italian legal system, due in part to the specific characteristics of the market.


Over the past ten years, the cross-correlation of long time series of ambient seismic noise (ASN) has been widely adopted to extract the surface-wave part of the Green's Functions (GF). This stochastic procedure relies on the assumption that the ASN wavefield is diffuse and stationary. At frequencies below 1 Hz, the ASN is mainly composed of surface waves, whose origin is attributed to the sea-wave climate. Consequently, marked directional properties may be observed, which call for accurate investigation of the location and temporal evolution of the ASN sources before attempting any GF retrieval. Within this general context, this thesis is aimed at a thorough investigation of the feasibility and robustness of noise-based methods for imaging complex geological structures at the local (∼10-50 km) scale. The study focused on the analysis of an extended (11-month) seismological data set collected at the Larderello-Travale geothermal field (Italy), an area for which the underground geological structures are well constrained thanks to decades of geothermal exploration. Focusing on the secondary microseism band (SM; f > 0.1 Hz), I first investigated the spectral features and the kinematic properties of the noise wavefield using beamforming analysis, highlighting a marked variability with time and frequency. In the 0.1-0.3 Hz frequency band and during spring and summer, the SM waves propagate with high apparent velocities and from well-defined directions, likely associated with ocean storms in the southern hemisphere. Conversely, at frequencies above 0.3 Hz the distribution of back-azimuths is more scattered, indicating that this frequency band is the most appropriate for the application of stochastic techniques. For this latter frequency interval, I tested two correlation-based methods, acting in the time (NCF) and frequency (modified-SPAC) domains, respectively yielding estimates of the group- and phase-velocity dispersions.
Velocity data provided by the two methods are markedly discordant; comparison with independent geological and geophysical constraints suggests that NCF results are more robust and reliable.
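
The NCF method above rests on the idea that cross-correlating long noise recordings at two stations and stacking many windows makes the inter-station travel time emerge. A toy numpy sketch, with a common random "source" signal reaching station A before station B, plus independent local noise (all values invented):

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n = 0.05, 1000            # 20 Hz sampling, 50 s noise windows
lag_true = 40                  # inter-station travel time: 40 samples = 2 s

stack = np.zeros(2 * n - 1)
for _ in range(30):            # stack 30 windows ("days") of ambient noise
    src = rng.normal(size=n + lag_true)                 # diffuse source wavefield
    sta_a = src[lag_true:] + 0.5 * rng.normal(size=n)   # wave passes A first...
    sta_b = src[:n] + 0.5 * rng.normal(size=n)          # ...then B, lag_true later
    stack += np.correlate(sta_b, sta_a, mode="full")

lags = (np.arange(2 * n - 1) - (n - 1)) * dt
travel_time = lags[np.argmax(stack)]   # emergent arrival of the noise correlation
```

In a single window the peak is buried in the random correlation floor; it is the stacking over months of data, as in the thesis, that raises it above the noise.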


Radial velocities measured from near-infrared (NIR) spectra are a potential tool for searching for extrasolar planets around cool stars. High-resolution infrared spectrographs now reach the high precision of visible instruments, with constant improvement over time. GIANO is an infrared echelle spectrograph and a powerful tool for providing high-resolution spectra for accurate radial velocity measurements of exoplanets and for chemical and dynamical studies of stellar or extragalactic objects. No other IR instrument has GIANO's capability to cover the entire NIR wavelength range. In this work we develop an ensemble of IDL procedures to measure high-precision radial velocities on a few GIANO spectra acquired during the commissioning run, using the telluric lines as a wavelength reference. In Section 1.1 various exoplanet search methods are described; they exploit different properties of the planetary system. In Section 1.2 we describe the exoplanet population discovered through the different methods. In Section 1.3 we explain the motivation for NIR radial velocities and the main issue that has limited the pursuit of high-precision NIR radial velocity work, namely the lack of a suitable calibration method. We briefly describe calibration methods in the visible and the solutions for IR calibration, for instance the use of telluric lines, whose advantages and problems are discussed in detail; in this work we use telluric lines as the wavelength reference. In Section 1.4 the Cross Correlation Function (CCF) method, widely used to measure radial velocities, is described. In Section 1.5 we describe GIANO and its main science targets. In Chapter 2 the observational data obtained with the GIANO spectrograph are presented and the selection criteria are reported. In Chapter 3 we describe the details of the analysis and examine in depth the flow chart reported in Section 3.1.
In Chapter 4 we give the radial velocities measured with our IDL procedure for all available targets. We obtain an rms scatter in radial velocity of about 7 m/s. We conclude that GIANO can be used to measure radial velocities of late-type stars with an accuracy close to or better than 10 m/s, using telluric lines as a wavelength reference. In September 2014 GIANO is in operation at the TNG for Science Verification, and more observational data will allow this analysis to be further refined.
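
The CCF method of Section 1.4 can be sketched as follows: shift a template spectrum over a grid of trial velocities and take the velocity that maximizes the correlation with the observed spectrum. A self-contained toy example with invented Gaussian absorption lines and a 3 km/s Doppler shift (the real pipeline uses telluric lines as the wavelength reference):

```python
import numpy as np

C = 299792458.0  # speed of light, m/s

def ccf_rv(wave, template_flux, observed_flux, v_grid):
    """Radial velocity from the peak of the cross-correlation function."""
    ccf = np.empty(v_grid.size)
    for i, v in enumerate(v_grid):
        # Doppler-shift the template onto the observed wavelength grid.
        shifted = np.interp(wave, wave * (1.0 + v / C), template_flux)
        ccf[i] = np.dot(shifted - shifted.mean(), observed_flux - observed_flux.mean())
    return v_grid[np.argmax(ccf)]

# Synthetic NIR template: continuum with Gaussian absorption lines (invented).
wave = np.linspace(15000.0, 15100.0, 4000)   # angstrom
template = np.ones_like(wave)
for c0 in (15010.0, 15035.0, 15070.0, 15090.0):
    template -= 0.5 * np.exp(-0.5 * ((wave - c0) / 0.2) ** 2)

# "Observed" spectrum: the template redshifted by +3 km/s.
v_true = 3000.0
observed = np.interp(wave, wave * (1.0 + v_true / C), template)

v_grid = np.arange(-10000.0, 10000.0, 100.0)
v_est = ccf_rv(wave, template, observed, v_grid)
```

In practice the CCF peak is fitted with a smooth function to reach the sub-grid (m/s-level) precision quoted in the thesis.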


LiDAR is a measuring instrument that has seen enormous development in recent decades and is yielding results of great practical value. We carried out distance measurements using an instrument built from salvaged components and simple software written by ourselves. The first, more theoretical, part of the work illustrates the operation of the instrument, based on sending laser beams at opaque targets and receiving their reflection. Particular attention is paid to the methods developed to exploit continuous-wave rather than pulsed lasers, which would be more expensive: pseudorandom bit sequences. The experimental part presents the data analysis and discusses the results, with the aim of testing several hypotheses, paying particular attention to the comparison of the different sequences. The purpose of this work is to characterize the instrument through the analysis of the measurements and to test the claim of reference [1] that particular bit sequences (A1 and A2) would give better results than the maximum-length pseudorandom sequence, the M-sequence.
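
With a continuous-wave laser modulated by a pseudorandom bit sequence, the round-trip delay is recovered from the circular cross-correlation of the transmitted code with the received echo, which for an M-sequence has a single sharp peak. A sketch with a 127-chip M-sequence from a 7-bit LFSR and an invented delay and chip time:

```python
import numpy as np

def mseq7():
    """127-chip maximum-length sequence from a 7-bit LFSR (taps 7 and 6)."""
    state = 0b1  # any nonzero seed works
    bits = []
    for _ in range(127):
        bits.append(state & 1)
        fb = (state ^ (state >> 1)) & 1       # XOR of the two tap bits
        state = (state >> 1) | (fb << 6)
    return np.array(bits) * 2.0 - 1.0         # map {0,1} -> {-1,+1}

code = mseq7()
delay_true = 23                               # round-trip delay in chips, invented
rng = np.random.default_rng(3)
echo = np.roll(code, delay_true) + 0.5 * rng.normal(size=code.size)

# Circular cross-correlation peaks at the round-trip delay; for an M-sequence
# all off-peak values are flat, which is what makes ranging unambiguous.
corr = np.array([np.dot(code, np.roll(echo, -k)) for k in range(code.size)])
est_delay = int(np.argmax(corr))

chip_time = 10e-9                             # assumed 10 ns chip duration
distance = 0.5 * est_delay * chip_time * 3.0e8   # one-way distance in metres
```

Comparing sequences, as the thesis does for A1/A2 versus the M-sequence, amounts to comparing the sidelobe structure of this correlation.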


Ocean acidification, a consequence of climate change, is a process that is still poorly understood. Naturally acidified environments, regarded as open-air laboratories, can be used to study this phenomenon. The aim of this thesis was to use the fumaroles on the island of Ischia to investigate the dynamics of acidification processes and to analyse possible interactions between pH and meteorological conditions. The data, provided by the Stazione Zoologica "Anton Dohrn" of Naples, consisted of continuously recorded pH and wind series from two areas, north and south of the Castello Aragonese islet, and from three stations along an acidification gradient. The work proceeded in steps, with the result of each analysis suggesting the type and method of the next. The two series were first analysed individually to obtain their most salient parameters, and then correlated with each other to estimate the influence of wind on pH. Overall, acidification proved to be correlated with wind, but the response appears site-specific, depending on other factors interacting at the local scale, such as the geomorphology of the area, marine currents and the bathymetry of the seabed. Clear correlations between the two series were, however, difficult to establish, because the series are very complex, owing both to the large number of zeros in the wind series and to the strong natural variability of pH at the various stations examined.

More generally, this work demonstrates how to apply time-series analysis techniques, and how regression, autocorrelation, cross-correlation and smoothing methods can complement models that account for variables exogenous to the variable of interest.
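
The wind-pH relationship described above can be probed with a lagged Pearson cross-correlation. A toy sketch with synthetic hourly series in which pH responds negatively to wind with a 6-hour delay (all numbers invented; note the zero-inflated wind, as in the real series):

```python
import numpy as np

def lagged_xcorr(x, y, max_lag):
    """Pearson correlation of x(t) with y(t + lag) for lags -max_lag..max_lag."""
    lags = np.arange(-max_lag, max_lag + 1)
    out = []
    for lag in lags:
        if lag < 0:
            a, b = x[-lag:], y[:lag]
        elif lag > 0:
            a, b = x[:-lag], y[lag:]
        else:
            a, b = x, y
        out.append(np.corrcoef(a, b)[0, 1])
    return lags, np.array(out)

rng = np.random.default_rng(4)
wind = np.clip(rng.normal(3, 2, 2000), 0, None)      # many calm (zero) hours
ph = 8.0 - 0.05 * np.roll(wind, 6) + rng.normal(0, 0.02, 2000)

lags, cc = lagged_xcorr(wind, ph, max_lag=24)
best_lag = lags[np.argmin(cc)]   # strongest (negative) wind-to-pH link
```

With real data the zero-inflated wind series and the natural pH variability blur this picture considerably, which is exactly the difficulty the abstract reports.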