13 results for joint time-frequency analysis
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
This thesis work aims to develop a procedure for isolating specific features of the current signal from a plasma focus for medical applications. The structure of the current signal inside a plasma focus is unique to this class of machines, so a specific analysis procedure has to be developed. The hope is to find one or more features that show a correlation with the delivered dose. The study of the correlation between the current discharge signal and the dose delivered by a plasma focus could be of some importance not only for the practical application of dose prediction but also for expanding the knowledge about plasma focus physics. Various classes of time-frequency analysis techniques are implemented in order to solve the problem.
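The abstract does not specify which time-frequency techniques were implemented; as one common starting point, a short-time Fourier transform (STFT) localises the energy of a transient current trace in both time and frequency. A minimal sketch on a synthetic damped-oscillation signal (the sampling rate, oscillation frequency and decay constant are hypothetical illustration values, not plasma-focus data):

```python
import numpy as np
from scipy.signal import stft

# Synthetic current trace: a damped sinusoid loosely mimicking a
# discharge oscillation (all parameters are hypothetical).
fs = 1e6                              # sampling rate, 1 MHz
t = np.arange(0, 2e-3, 1 / fs)
current = np.exp(-t / 5e-4) * np.sin(2 * np.pi * 50e3 * t)

# STFT: window the signal and FFT each window, trading frequency
# resolution against time localisation.
f, tau, Z = stft(current, fs=fs, nperseg=256)

# The spectrogram magnitude |Z| localises energy in time and frequency;
# its ridge should sit near the 50 kHz oscillation frequency.
ridge_freq = f[np.argmax(np.abs(Z[:, 1]))]
```

From the matrix `|Z|` one can then extract candidate features (ridge frequency, decay of spectral energy, etc.) to correlate with the dose.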
Abstract:
The surface of the Earth is subjected to vertical deformations caused by geophysical and geological processes which can be monitored by Global Positioning System (GPS) observations. The purpose of this work is to investigate GPS height time series to identify interannual signals affecting the Earth’s surface over the European and Mediterranean area during the period 2001-2019. Thirty-six homogeneously distributed GPS stations were selected from the online dataset made available by the Nevada Geodetic Laboratory (NGL) on the basis of the length and quality of the data series. Principal Component Analysis (PCA) is the technique applied to extract the main patterns of the spatial and temporal variability of the GPS Up coordinate. The time series were studied by means of a frequency analysis using a periodogram and the real-valued Morlet wavelet. The periodogram is used to identify the dominant frequencies and the spectral density of the investigated signals; the wavelet is applied to identify the signals in the time domain and the relevant periodicities. This study has identified, over the European and Mediterranean area, the presence of interannual non-linear signals with a period of 2-to-4 years, possibly related to atmospheric and hydrological loading displacements and to climate phenomena such as the El Niño Southern Oscillation (ENSO). A clear signal with a period of about six years is present in the vertical component of the GPS time series, likely explainable by the gravitational coupling between the Earth’s mantle and the inner core. Moreover, signals with a period on the order of 8-9 years, possibly explained by mantle-inner core gravity coupling and the cycle of the lunar perigee, and a signal of 18.6 years, likely associated with the lunar nodal cycle, were identified through the wavelet spectrum. However, these last two signals need further confirmation because the present length of the GPS time series is still too short when compared to the periods involved.
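The periodogram step described above can be sketched on a synthetic monthly height series; the amplitudes, noise level and periods below are invented for illustration and are not the NGL data:

```python
import numpy as np
from scipy.signal import periodogram

# Synthetic monthly GPS Up series spanning 19 years (like 2001-2019):
# an annual term plus an interannual ~6-year oscillation, in mm.
# Amplitudes and noise level are hypothetical.
rng = np.random.default_rng(0)
t = np.arange(0, 19, 1 / 12)                      # time in years
up = (3.0 * np.sin(2 * np.pi * t / 1.0)           # annual signal
      + 1.5 * np.sin(2 * np.pi * t / 6.0)         # interannual signal
      + 0.3 * rng.standard_normal(t.size))        # measurement noise

# Periodogram: power spectral density vs frequency in cycles/year.
freq, psd = periodogram(up, fs=12.0)

# The dominant peak should appear at 1 cycle/year (the annual term);
# a secondary peak near 1/6 cycle/year marks the interannual signal.
annual_peak = freq[np.argmax(psd)]
```

A wavelet analysis (e.g. with a Morlet mother wavelet) would complement this by showing when in the series each periodicity is active.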
Abstract:
The work for the present thesis started in California, during my semester as an exchange student overseas. California is known worldwide for its seismicity and its effort in the earthquake engineering research field. For this reason, I immediately found interesting the proposal of the Structural Dynamics professor, Maria Q. Feng, to work on a pushover analysis of the existing Jamboree Road Overcrossing bridge. Concrete is a popular building material in California, and for the most part it serves its functions well. However, concrete is inherently brittle and performs poorly during earthquakes if not reinforced properly. The San Fernando Earthquake of 1971 dramatically demonstrated this characteristic. Shortly thereafter, code writers revised the design provisions for new concrete buildings so as to provide adequate ductility to resist strong ground shaking. There remain, nonetheless, millions of square feet of non-ductile concrete buildings in California. The purpose of this work is to perform a pushover analysis and compare the results with those of a nonlinear time-history analysis of an existing bridge located in Southern California. The analyses have been executed with the software OpenSees, the Open System for Earthquake Engineering Simulation. The Jamboree Road Overcrossing is classified as a Standard Ordinary Bridge: the JRO is a typical three-span continuous cast-in-place prestressed post-tensioned box girder. The total length of the bridge is 366 ft, and the heights of the two bents are respectively 26.41 ft and 28.41 ft. Both the pushover analysis and the nonlinear time-history analysis require a model that accounts for the nonlinearities of the system; in fact, in order to execute nonlinear analyses of highway bridges it is essential to incorporate an accurate model of the material behavior.
It has been observed that, after destructive earthquakes, among the most damaged elements of highway bridges are the columns. To evaluate the performance of bridge columns during seismic events, an adequate model of the column must be incorporated. Part of the work of the present thesis is, in fact, dedicated to the modeling of the bents. Different types of nonlinear elements have been studied and modeled, with emphasis on the determination of the plasticity zone length and location. Furthermore, different models for the concrete and steel materials have been considered, and the parameters defining the constitutive laws of the different materials have been selected with care. The work is structured into four chapters; a brief overview of their content follows. The first chapter introduces the concepts related to capacity design, the current philosophy of seismic design. Furthermore, nonlinear analyses, both static (pushover) and dynamic (time-history), are presented. The final paragraph concludes with a short description of how to determine the seismic demand at a specific site, according to the latest design criteria in California. The second chapter deals with the formulation of force-based finite elements and the issues regarding the objectivity of the response in the nonlinear field. Both concentrated and distributed plasticity elements are discussed in detail. The third chapter presents the existing structure, the software used (OpenSees), and the modeling assumptions and issues. The creation of the nonlinear model represents a central part of this work. Nonlinear material constitutive laws for concrete and reinforcing steel are discussed in detail, as are the different scenarios employed in the column modeling. Finally, the results of the pushover analysis are presented in chapter four. Capacity curves are examined for the different model scenarios used, and the failure modes of concrete and steel are discussed.
The capacity curve is converted into a capacity spectrum and intersected with the design spectrum. In the last paragraph, the results of the nonlinear time-history analyses are compared with those of the pushover analysis.
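As a toy illustration of the pushover procedure the abstract describes (not the OpenSees model of the JRO), a displacement-controlled pushover of a single elastic-perfectly-plastic spring produces the characteristic bilinear capacity curve; the stiffness and yield force below are hypothetical:

```python
import numpy as np

# Displacement-controlled pushover of one elastic-perfectly-plastic
# spring: a stand-in for a bridge bent. Values are hypothetical.
k = 200.0     # elastic stiffness, kip/in
fy = 500.0    # yield force, kip

def resisting_force(d):
    """Elastic-perfectly-plastic force-displacement law."""
    return float(np.clip(k * d, -fy, fy))

# Push through increasing target displacements and record the base
# shear: the (displacement, force) pairs form the capacity curve.
disps = np.linspace(0.0, 10.0, 101)
capacity = [(d, resisting_force(d)) for d in disps]

# Beyond the yield displacement fy/k = 2.5 in, the curve plateaus at fy.
```

A real pushover replaces the single spring with the full nonlinear finite-element model and tracks distributed yielding, but the output has the same shape: a monotonically pushed structure traced out as a force-displacement curve.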
Abstract:
This thesis analyzes the collapse of a precast reinforced-concrete industrial building during the 2012 Emilia earthquake, focusing on the failure mechanisms and in particular on flexure-shear interaction. The analysis was performed as a time-history analysis using a FEM model built in the software SAP2000. Finally, the collapse is reconstructed on the basis of the numerical data on the strength capacity of the failed elements, using formulations for lightly reinforced columns under high shear and bending moment.
Abstract:
Although nowadays DMTA is one of the most widely used techniques to characterize the thermo-mechanical behaviour of polymers, it is only effective for small-amplitude oscillatory tests and is limited to single-frequency analysis (linear regime). In this thesis work, a Fourier-transform-based experimental system has proven to give hints on structural and chemical changes in specimens during large-amplitude oscillatory tests by exploiting multi-frequency spectral analysis, turning out to be a more sensitive tool than the classical linear approach. The test campaign focused on three test typologies: strain sweep tests, damage investigation, and temperature sweep tests.
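The multi-frequency idea can be sketched as follows: a nonlinear material response to a single-frequency oscillation contains higher harmonics that a Fourier transform exposes but a single-frequency (linear) analysis misses. The coefficients below are hypothetical illustration values, not the experimental setup of the thesis:

```python
import numpy as np

# A mild cubic nonlinearity turns a single-frequency strain input into
# a stress signal with odd higher harmonics (parameters hypothetical).
fs, f0 = 1000.0, 1.0                    # sampling rate, drive frequency (Hz)
t = np.arange(0, 10, 1 / fs)            # 10 full drive periods
strain = np.sin(2 * np.pi * f0 * t)
stress = strain + 0.2 * strain**3       # sin^3 contributes a 3rd harmonic

# Discrete Fourier spectrum of the response.
spec = np.abs(np.fft.rfft(stress)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Ratio of the 3rd harmonic to the fundamental: a standard nonlinearity
# measure. Here sin^3 = (3 sin x - sin 3x)/4, so the ratio is 0.05/1.15.
i1 = np.argmin(np.abs(freqs - f0))
i3 = np.argmin(np.abs(freqs - 3 * f0))
I3_I1 = spec[i3] / spec[i1]
```

Tracking how such harmonic ratios evolve during strain sweeps or damage tests is what makes the multi-frequency approach more sensitive than the linear one.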
Abstract:
One of the main process features under study in Cognitive Translation & Interpreting Studies (CTIS) is the chronological unfolding of the tasks. The analyses of time spans in translation have been conceived in two ways: (1) studying those falling between text units of different sizes: words, phrases, sentences, and paragraphs; (2) setting arbitrary time span thresholds to explore where they fall in the text, whether between text units or not. Writing disfluencies may lead to comprehensive insights into the cognitive activities involved in typing while translating. Indeed, long time spans are often taken as hints that cognitive resources have been subtracted from typing and devoted to other activities, such as planning, evaluating, etc. This exploratory, pilot study combined both approaches to seek potential general tendencies and contrasts in informants’ inferred mental processes when performing different writing tasks, through the analysis of their behaviors, as keylogged. The study tasks were retyping, monolingual free writing, translation, revision and a multimodal task—namely, monolingual text production based on an infographic leaflet. Task logs were chunked, and shorter time spans, including those within words, were analyzed following the Task Segment Framework (Muñoz & Apfelthaler, in press). Finally, time span analysis was combined with the analysis of the texts as to their lexical density, type-token ratio and word frequency. Several previous results were confirmed, and some others were surprising. Time spans in free writing were longer between paragraphs and sentences, possibly hinting at planning and, in translation, between clauses and words, suggesting more cognitive activities at these levels. On the other hand, the infographic was expected to facilitate the writing process, but most time spans were longer than in both free writing and translation. Results of the multimodal task and some other results suggest avenues for further research.
Abstract:
The discovery of the neutrino mass is direct evidence of new physics. Several questions arise from this observation, regarding the mechanism originating the neutrino masses and their hierarchy, the violation of lepton number conservation and the generation of the baryon asymmetry. These questions can be addressed by the experimental search for neutrinoless double beta (0\nu\beta\beta) decay, a nuclear decay consisting of two simultaneous beta emissions without the emission of two antineutrinos. 0\nu\beta\beta decay is possible only if neutrinos are identical to antineutrinos, namely if they are Majorana particles. Several experiments are searching for 0\nu\beta\beta decay. Among these, CUORE is employing 130Te embedded in TeO_2 bolometric crystals. It needs an accurate understanding of the background contribution in the energy region around the Q-value of 130Te. One of the main contributions is given by particles from the decay chains of contaminating nuclei (232Th, 235-238U) present in the active crystals or in the support structure. This thesis uses the 1 ton yr CUORE dataset to study these contaminations by looking for events belonging to sub-chains of the Th and U decay chains and reconstructing their energy and time-difference distributions in a delayed coincidence analysis. These results, in combination with studies on simulated data, are then used to evaluate the contaminations. This is the first time this analysis has been applied to the CUORE data, and this thesis highlights its feasibility while providing a starting point for further studies. Part of the obtained results agrees with those from previous analyses, demonstrating that delayed coincidence searches might improve the understanding of the CUORE experiment background. This kind of delayed coincidence analysis can also be reused in the future, once the data of CUPID, the CUORE upgrade, are ready to be analyzed, with the aim of improving the sensitivity to the 0\nu\beta\beta decay of 100Mo.
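The delayed-coincidence logic can be sketched in a few lines: look for a "daughter" event whose energy matches a known line and which follows a "parent" event within a time window set by the daughter's half-life. The energies, tolerance and window below are invented illustration values, not the CUORE analysis cuts:

```python
# Toy event list: (time in s, energy in keV). Values are hypothetical.
events = [
    (10.0, 6288.0),   # parent-like energy
    (10.4, 6778.0),   # daughter-like energy 0.4 s later -> coincidence
    (50.0, 6288.0),   # parent with no daughter in the window
    (80.0, 6778.0),   # daughter with no preceding parent
]

def delayed_coincidences(events, e_parent, e_daughter, tol, t_max):
    """Return time differences for daughter events that follow a
    parent event within t_max, with both energies inside +/- tol."""
    dts = []
    for tp, ep in events:
        if abs(ep - e_parent) > tol:
            continue
        for td, ed in events:
            if abs(ed - e_daughter) <= tol and 0 < td - tp <= t_max:
                dts.append(td - tp)
    return dts

dts = delayed_coincidences(events, 6288.0, 6778.0, tol=20.0, t_max=2.0)
# One coincidence is found, with dt of about 0.4 s; in a real analysis
# the dt distribution would follow the daughter's exponential decay law.
```

Fitting the dt distribution and counting tagged pairs (against the simulated detection efficiency) is what turns such searches into a contamination estimate.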
Abstract:
This thesis deals with the study and implementation of Orthogonal Time Frequency Space (OTFS) modulation in Joint Sensing and Communication (JSC) systems, i.e. systems in which the communication and sensing functionalities share the same hardware and the same frequency band. OTFS is a recent two-dimensional modulation technique, designed in the delay-Doppler domain, which is attracting particular interest in the telecommunications landscape because it brings some important advantages to cellular network systems. Compared with modern multicarrier modulations such as OFDM, OTFS proves able to offer better and more robust performance in high-mobility conditions, with higher spectral efficiency, since it does not require a cyclic prefix to handle the ISI problem. Another important strength of OTFS modulation is that it is particularly well suited to and compatible with JSC applications, mainly thanks to its extreme robustness to large Doppler shifts and the possibility of sensing at considerably greater distances than OFDM, thanks to the absence of the cyclic prefix. All of this, however, comes at the cost of a considerable increase in system complexity, with very long execution times. After presenting the OTFS system in general and in the JSC context, in a particular scenario, it was implemented in MATLAB in two versions: a standard one and a reduced-complexity one, able to reach very high resolutions within contained computation times. Finally, this system, which estimates the radar parameters of range and relative radial velocity associated with a given target, was simulated, and the performance results for the parameter estimates, in terms of RMSE and CRLB, were reported and compared with those obtained in the OFDM case under the same scenario.
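The mapping between the delay-Doppler and time-frequency domains at the heart of OTFS can be sketched with a pair of per-axis FFTs (the ISFFT and its inverse, the SFFT, written here in one common sign convention); over an ideal channel the round trip recovers the transmitted symbols exactly. Grid sizes are arbitrary illustration values, and no channel model or sensing processing is included:

```python
import numpy as np

# Delay-Doppler grid of QPSK symbols: M delay bins x N Doppler bins.
M, N = 8, 4
rng = np.random.default_rng(1)
re = 2 * rng.integers(0, 2, (M, N)) - 1
im = 2 * rng.integers(0, 2, (M, N)) - 1
x_dd = (re + 1j * im) / np.sqrt(2)

# ISFFT: FFT along the Doppler axis, IFFT along the delay axis
# (unitary norms), mapping delay-Doppler to time-frequency.
X_tf = np.fft.ifft(np.fft.fft(x_dd, axis=1, norm="ortho"),
                   axis=0, norm="ortho")

# Ideal channel: the received time-frequency grid equals the sent one.
Y_tf = X_tf

# SFFT (inverse of the ISFFT) recovers the delay-Doppler symbols.
y_dd = np.fft.fft(np.fft.ifft(Y_tf, axis=1, norm="ortho"),
                  axis=0, norm="ortho")
round_trip_error = np.max(np.abs(y_dd - x_dd))
```

In a JSC receiver, the same delay-Doppler grid that carries data doubles as a range-velocity map: a target shows up as a shift along the delay (range) and Doppler (radial velocity) axes.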
Abstract:
For seven years now, the permanent GPS station at Baia Terranova has been acquiring daily data which, suitably processed, contribute to the understanding of Antarctic dynamics and make it possible to verify whether global geophysical models fit the area of interest of the permanent GPS station. A bibliographic survey showed that a GPS series presents multiple possible perturbations, mainly due to errors in the modelling of some ancillary data needed for processing. Moreover, some analyses have shown that such time series derived from geodetic surveys are affected by different types of noise which, if not properly taken into account, can alter the parameters of interest for the geophysical interpretation of the data. This thesis work consists in understanding to what extent such errors can affect the dynamic parameters characterizing the motion of the permanent station, with particular reference to the velocity of the point on which the station is installed and to any periodic signals that may be identified.
Abstract:
A regional envelope curve (REC) of flood flows summarises the current bound on our experience of extreme floods in a region. RECs are available for most regions of the world. Recent scientific papers introduced a probabilistic interpretation of these curves and formulated an empirical estimator of the recurrence interval T associated with a REC, which, in principle, enables us to use RECs for design purposes in ungauged basins. The aim of this work is twofold. First, it extends the REC concept to extreme rainstorm events by introducing the Depth-Duration Envelope Curves (DDEC), defined as the regional upper bound on all the record rainfall depths at present for various rainfall durations. Second, it adapts the probabilistic interpretation proposed for RECs to DDECs and assesses the suitability of these curves for estimating the T-year rainfall event associated with a given duration and large T values. Probabilistic DDECs are complementary to regional frequency analysis of rainstorms, and their use in combination with a suitable rainfall-runoff model can provide useful indications on the magnitude of extreme floods for gauged and ungauged basins. The study focuses on two different national datasets: the peak-over-threshold (POT) series of rainfall depths with durations of 30 min. and 1, 3, 9 and 24 hrs. obtained for 700 Austrian raingauges, and the Annual Maximum Series (AMS) of rainfall depths with durations spanning from 5 min. to 24 hrs. collected at 220 raingauges located in northern-central Italy. The estimation of the recurrence interval of a DDEC requires the quantification of the equivalent number of independent data which, in turn, is a function of the cross-correlation among sequences. While the quantification and modelling of intersite dependence is a straightforward task for AMS series, it may be cumbersome for POT series. This work proposes a possible approach to address this problem.
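The envelope-curve construction can be sketched as follows, assuming the common log-log-linear form for the depth-duration relation; the record depths below are invented illustration values, not the Austrian or Italian datasets:

```python
import numpy as np

# Hypothetical regional record rainfall depths for a set of durations.
durations = np.array([0.5, 1.0, 3.0, 9.0, 24.0])           # hours
record_depth = np.array([42.0, 60.0, 95.0, 140.0, 190.0])  # mm

# Fit a straight line to (log d, log h), then raise its intercept so
# that the curve bounds every record: h(d) = exp(log_a_env) * d**b.
b, log_a = np.polyfit(np.log(durations), np.log(record_depth), 1)
log_a_env = np.max(np.log(record_depth) - b * np.log(durations))

def envelope_depth(d_hours):
    """Envelope rainfall depth (mm) for a duration in hours."""
    return np.exp(log_a_env + b * np.log(d_hours))

# By construction, no observed record exceeds the envelope.
```

The probabilistic step then assigns the envelope a recurrence interval T from the number of effectively independent station-years behind it, which is where the cross-correlation issue for POT series enters.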
Abstract:
The aim of the present study is to define the general framework of the Merluccius merluccius population structure, to estimate the growth rate, and to assess the recruitment dynamics of juveniles from the Northern and Central Adriatic, through otolith analysis. The otoliths of hake specimens collected during the MedITS trawl survey in 2012 in GSA 17 were cleaned, and 102 otoliths out of 506 were embedded, sectioned, ground and polished to obtain frontal and sagittal sections. The whole sample was analysed under a stereomicroscope and an optical microscope, with a camera connected to a PC equipped with an image-analysis program. The frequency analysis of size classes and age revealed that the catch is dominated by hake with >200 mm TL and more than one year old. The average size of M. merluccius at the end of the first year of life is about 199 mm TL. Allometric analyses between fish TL and Feret (major axis), MiniFeret (minor axis), Area and Perimeter showed a direct proportionality among lengths. Among the 88 otolith sections analysed, the number of daily increments read ranged from 86 to 206, within the 55 to 175 mm TL range. The age estimates ranged from about 2-3 to 9 months, and the growth rate from 20.99 to 27.15 mm TL. The hatch-date distribution, obtained by back-calculation, showed that hatching occurs in November-March. In conclusion, strong preventive measures are needed for adult hake, because the success of this species seems to be linked to the protection of the deep-water ecosystems where the big spawners dwell.
Abstract:
PhEDEx, the CMS transfer management system, moved about 150 PB during the first LHC Run and is currently moving about 2.5 PB of data per week over the Worldwide LHC Computing Grid (WLCG). It was designed to complete each transfer required by users at the expense of the waiting time necessary for its completion. For this reason, after several years of operations, data regarding transfer latencies have been collected and stored in log files containing useful, analyzable information. Then, starting from the analysis of several typical CMS transfer workflows, a categorization of such latencies has been made, with a focus on the different factors that contribute to the transfer completion time. The analysis presented in this thesis will provide the necessary information for equipping PhEDEx in the future with a set of new tools in order to proactively identify and fix any latency issues.
Abstract:
This dissertation presents a calibration procedure for a pressure-velocity probe. The dissertation is divided into four main chapters. The first chapter is divided into six main sections. In the first two, the wave equation in fluids and the velocity of sound in gases are derived; the third section contains a general solution of the wave equation in the case of plane acoustic waves. Sections four and five report the definitions of acoustic impedance and admittance, and the practical units the sound level is measured with, i.e. the decibel scale. Finally, the last section of the chapter is about the theory linked to the frequency analysis of a sound wave and includes the analysis of sound in bands and the discrete Fourier analysis, with the definition of some important functions. The second chapter describes different reference-field calibration procedures that are used to calibrate P-V probes, among them the progressive plane wave method, which is the one used in this work. Finally, the last section of the chapter contains a description of the working principles of the two transducers that have been used, with a focus on the velocity one. The third chapter of the dissertation is devoted to the explanation of the calibration setup and the instruments used for data acquisition and analysis. Since software routines were extremely important, this chapter includes a dedicated section on them, and the proprietary routines most used are thoroughly explained. Finally, there is the description of the work that has been done, identified with three different phases, where the data acquired and the results obtained are presented. All the graphs and data reported were obtained through Matlab® routines. As for the last chapter, it briefly presents all the work that has been done, as well as an excursus on a new probe and on the way the procedure implemented in this dissertation could be applied in the case of a general field.