8 results for time-motion analysis
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
For seven years now, the permanent GPS station at Baia Terranova has been acquiring daily data which, suitably processed, contribute to the understanding of Antarctic dynamics and to verifying whether global geophysical models fit the area covered by the permanent GPS station. A literature review showed that a GPS time series is subject to many possible perturbations, mainly due to errors in modelling some of the ancillary data needed for processing. Moreover, several analyses have shown that such time series, derived from geodetic surveys, are affected by different types of noise which, if not properly accounted for, can alter the parameters of interest for the geophysical interpretation of the data. This thesis work aims to understand to what extent these errors can affect the dynamic parameters that characterize the motion of the permanent station, with particular reference to the velocity of the point on which the station is installed and to any periodic signals that can be detected.
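The station velocity and periodic signals are commonly estimated by least-squares fitting of a linear trend plus annual and semi-annual sinusoids to each coordinate component. A minimal sketch in Python on synthetic data (all values are illustrative, not the station's):

```python
import numpy as np

def fit_gps_series(t_years, y):
    """Least-squares fit: offset + velocity*t + annual + semi-annual terms."""
    w = 2.0 * np.pi  # angular frequency of the annual cycle (rad/yr)
    A = np.column_stack([
        np.ones_like(t_years),                            # offset
        t_years,                                          # linear velocity
        np.sin(w * t_years), np.cos(w * t_years),         # annual term
        np.sin(2 * w * t_years), np.cos(2 * w * t_years), # semi-annual term
    ])
    params, *_ = np.linalg.lstsq(A, y, rcond=None)
    return params  # [offset, velocity, s1, c1, s2, c2]

# Synthetic 7-year daily series: 12 mm/yr trend + 3 mm annual signal + noise
rng = np.random.default_rng(0)
t = np.arange(0, 7, 1 / 365.25)
y = 12.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0.0, 1.0, t.size)
p = fit_gps_series(t, y)
print(f"estimated velocity = {p[1]:.2f} mm/yr")
```

In practice the noise model matters: with time-correlated (e.g. flicker) noise, the plain least-squares velocity uncertainty is underestimated, which is exactly the kind of effect the abstract discusses.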
Abstract:
The work for the present thesis started in California, during my semester as an exchange student overseas. California is known worldwide for its seismicity and its effort in the earthquake engineering research field. For this reason, I immediately found interesting the proposal of Structural Dynamics Professor Maria Q. Feng to work on a pushover analysis of the existing Jamboree Road Overcrossing bridge. Concrete is a popular building material in California, and for the most part it serves its functions well. However, concrete is inherently brittle and performs poorly during earthquakes if not reinforced properly. The San Fernando Earthquake of 1971 dramatically demonstrated this characteristic. Shortly thereafter, code writers revised the design provisions for new concrete buildings so as to provide adequate ductility to resist strong ground shaking. There remain, nonetheless, millions of square feet of non-ductile concrete buildings in California. The purpose of this work is to perform a pushover analysis of an existing bridge located in Southern California and to compare the results with those of a nonlinear time-history analysis. The analyses have been executed with OpenSees, the Open System for Earthquake Engineering Simulation. The Jamboree Road Overcrossing (JRO) is classified as a Standard Ordinary Bridge: a typical three-span continuous cast-in-place post-tensioned prestressed box-girder. The total length of the bridge is 366 ft, and the heights of the two bents are 26.41 ft and 28.41 ft, respectively. Both the pushover analysis and the nonlinear time-history analysis require a model that accounts for the nonlinearities of the system; in order to execute nonlinear analyses of highway bridges, it is essential to incorporate an accurate model of the material behavior.
It has been observed that, after destructive earthquakes, columns are among the most damaged elements of highway bridges. To evaluate the performance of bridge columns during seismic events, an adequate column model must be incorporated. Part of the work of the present thesis is, in fact, dedicated to the modeling of the bents. Different types of nonlinear elements have been studied and modeled, with emphasis on determining the length and location of the plasticity zone. Furthermore, different models for the concrete and steel materials have been considered, and the parameters defining the constitutive laws of the different materials have been selected with care. The work is structured into four chapters; a brief overview of the content follows. The first chapter introduces the concepts related to capacity design, the current philosophy of seismic design. Furthermore, nonlinear analyses, both static (pushover) and dynamic (time-history), are presented. The final paragraph concludes with a short description of how to determine the seismic demand at a specific site, according to the latest design criteria in California. The second chapter deals with the formulation of force-based finite elements and the issues regarding the objectivity of the response in the nonlinear field. Both concentrated and distributed plasticity elements are discussed in detail. The third chapter presents the existing structure, the software used (OpenSees), and the modeling assumptions and issues. The creation of the nonlinear model represents a central part of this work. Nonlinear material constitutive laws for concrete and reinforcing steel are discussed in detail, as are the different scenarios employed in modeling the columns. Finally, the results of the pushover analysis are presented in chapter four. Capacity curves are examined for the different model scenarios used, and the failure modes of concrete and steel are discussed.
The capacity curve is converted into a capacity spectrum and intersected with the design spectrum. In the last paragraph, the results of the nonlinear time-history analyses are compared to those of the pushover analysis.
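The capacity-curve-to-capacity-spectrum conversion is commonly performed in ADRS format following the ATC-40 procedure, scaling base shear by the effective modal mass coefficient and roof displacement by the participation factor. A hedged sketch with illustrative numbers (not the JRO bridge values):

```python
import numpy as np

def capacity_to_spectrum(V, d_roof, W, alpha1, PF1, phi_roof1):
    """Convert a base-shear vs. roof-displacement capacity curve to ADRS
    coordinates (spectral acceleration vs. spectral displacement),
    ATC-40 style: Sa = (V/W)/alpha1, Sd = d_roof/(PF1*phi_roof1)."""
    Sa = (np.asarray(V) / W) / alpha1            # spectral acceleration (g)
    Sd = np.asarray(d_roof) / (PF1 * phi_roof1)  # spectral displacement
    return Sa, Sd

# Illustrative capacity curve and modal properties (assumed values)
V = np.array([0.0, 500.0, 800.0, 850.0])  # base shear, kip
d = np.array([0.0, 1.0, 2.5, 4.0])        # roof displacement, in
Sa, Sd = capacity_to_spectrum(V, d, W=4000.0, alpha1=0.8,
                              PF1=1.3, phi_roof1=1.0)
print(Sa[1], Sd[1])
```

The intersection of this curve with the demand spectrum (also expressed in ADRS coordinates) then gives the performance point.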
Abstract:
This thesis analyzes the collapse of a precast reinforced-concrete industrial building during the 2012 Emilia earthquake, focusing on the failure mechanisms and in particular on flexure-shear interaction. The analysis was performed as a time-history analysis on a FEM model built with the software SAP2000. Finally, the collapse is reconstructed on the basis of the numerical data on the strength capacity of the failed elements, using formulations for lightly reinforced columns under high shear and bending moment.
Abstract:
This thesis work aims to find a procedure for isolating specific features of the current signal from a plasma focus for medical applications. The structure of the current signal inside a plasma focus is exclusive to this class of machines, so a specific analysis procedure has to be developed. The hope is to find one or more features that show a correlation with the delivered dose. The study of the correlation between the current discharge signal and the dose delivered by a plasma focus could be of some importance not only for the practical application of dose prediction but also for expanding the knowledge about plasma focus physics. Various classes of time-frequency analysis techniques are implemented in order to solve the problem.
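As one of the simplest time-frequency techniques, a short-time Fourier transform (spectrogram) can expose frequency features of a discharge-like signal. A minimal sketch on a synthetic damped sinusoid (the sampling rate and waveform are assumptions, not plasma focus data):

```python
import numpy as np
from scipy import signal

# Synthetic "discharge current": damped sinusoid whose dominant frequency
# stands in for a feature that might correlate with the delivered dose.
fs = 1e6                              # sampling rate, Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)        # 10 ms record
current = np.exp(-t / 3e-3) * np.sin(2 * np.pi * 50e3 * t)

# Time-frequency decomposition: frequency bins x time slices
f, tt, Sxx = signal.spectrogram(current, fs=fs, nperseg=256)

# Extract a scalar feature: the frequency bin with the most total energy
peak_freq = f[np.argmax(Sxx.sum(axis=1))]
print(f"dominant frequency ≈ {peak_freq / 1e3:.0f} kHz")
```

The same `Sxx` array also supports time-localized features (e.g. when the energy peaks), which is the point of using a time-frequency method rather than a plain FFT.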
Abstract:
The discovery of the neutrino mass is direct evidence of new physics. Several questions arise from this observation, regarding the mechanism originating the neutrino masses and their hierarchy, the violation of lepton number conservation, and the generation of the baryon asymmetry. These questions can be addressed by the experimental search for neutrinoless double beta (0\nu\beta\beta) decay, a nuclear decay consisting of two simultaneous beta emissions without the emission of two antineutrinos. 0\nu\beta\beta decay is possible only if neutrinos are identical to antineutrinos, namely if they are Majorana particles. Several experiments are searching for 0\nu\beta\beta decay. Among these, CUORE employs 130Te embedded in TeO_2 bolometric crystals, and it needs an accurate understanding of the background contribution in the energy region around the Q-value of 130Te. One of the main contributions is given by particles from the decay chains of contaminating nuclei (232Th, 235-238U) present in the active crystals or in the support structure. This thesis uses the 1 ton yr CUORE data to study these contaminations by looking for events belonging to sub-chains of the Th and U decay chains and reconstructing their energy and time-difference distributions in a delayed coincidence analysis. These results, in combination with studies on the simulated data, are then used to evaluate the contaminations. This is the first time this analysis is applied to the CUORE data, and this thesis highlights its feasibility while providing a starting point for further studies. Part of the obtained results agrees with those from previous analyses, demonstrating that delayed coincidence searches might improve the understanding of the CUORE experiment background. This kind of delayed coincidence analysis can also be reused in the future, once data from CUPID, the CUORE upgrade, are ready to be analyzed, with the aim of improving the sensitivity to the 0\nu\beta\beta decay of 100Mo.
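The delayed coincidence idea can be sketched as follows: scan a time-ordered event list for a parent-candidate energy followed, within a time window set by the daughter half-life, by a daughter-candidate energy. The energy windows, tolerances, and event values below are placeholders for illustration, not the actual CUORE selection cuts:

```python
def delayed_coincidences(events, e_parent, e_daughter, dt_max, tol=0.05):
    """Find parent -> daughter candidate pairs in a single detector.
    events: time-ordered list of (time_s, energy_keV).
    Energies must match within a fractional tolerance `tol`;
    the daughter must follow within `dt_max` seconds."""
    pairs = []
    for i, (t1, e1) in enumerate(events):
        if abs(e1 - e_parent) > tol * e_parent:
            continue
        for t2, e2 in events[i + 1:]:
            if t2 - t1 > dt_max:
                break  # events are time-ordered: no later match possible
            if abs(e2 - e_daughter) <= tol * e_daughter:
                pairs.append((t1, t2 - t1, e1, e2))
    return pairs

# Illustrative event list with one fast parent/daughter sequence
evts = [(0.0, 1000.0), (10.0, 609.0), (10.0002, 7687.0), (50.0, 2614.0)]
found = delayed_coincidences(evts, e_parent=609.0, e_daughter=7687.0,
                             dt_max=0.001)
print(found)
```

The histogram of the recorded time differences then follows the daughter's exponential decay law, which is what allows the contamination activity to be extracted.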
Abstract:
PhEDEx, the CMS transfer management system, moved about 150 PB during the first LHC Run and currently moves about 2.5 PB of data per week over the Worldwide LHC Computing Grid (WLCG). It was designed to complete each transfer required by users, at the expense of the waiting time necessary for its completion. For this reason, after several years of operations, data regarding transfer latencies have been collected and stored in log files containing useful, analyzable information. Then, starting from the analysis of several typical CMS transfer workflows, a categorization of such latencies has been made, with a focus on the different factors that contribute to the transfer completion time. The analysis presented in this thesis will provide the information necessary for equipping PhEDEx in the future with a set of new tools to proactively identify and fix latency issues, minimizing the impact on end users.
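The decomposition of a transfer's completion time into latency components can be sketched as below; the timestamp fields and component names are hypothetical and do not reflect the actual PhEDEx log schema:

```python
from dataclasses import dataclass

@dataclass
class TransferRecord:
    # Hypothetical per-transfer timestamps (seconds since request)
    requested: float   # user submits the transfer request
    routed: float      # system decides source and route
    started: float     # first byte moves
    finished: float    # transfer completes

def latency_components(r: TransferRecord) -> dict:
    """Split the total completion time into the waiting and moving phases
    that a latency categorization would attribute to different factors."""
    return {
        "routing_wait": r.routed - r.requested,
        "queue_wait": r.started - r.routed,
        "transfer_time": r.finished - r.started,
        "total": r.finished - r.requested,
    }

rec = TransferRecord(requested=0.0, routed=120.0, started=900.0,
                     finished=1500.0)
comp = latency_components(rec)
print(comp)
```

Aggregating such components over many transfers is what lets one see which factor (routing, queuing, or the transfer itself) dominates each workflow's completion time.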
Abstract:
One of the main process features under study in Cognitive Translation & Interpreting Studies (CTIS) is the chronological unfolding of the tasks. The analysis of time spans in translation has been conceived in two ways: (1) studying those falling between text units of different sizes (words, phrases, sentences, and paragraphs); (2) setting arbitrary time-span thresholds and exploring where they fall in the text, whether between text units or not. Writing disfluencies may lead to comprehensive insights into the cognitive activities involved in typing while translating. Indeed, long time spans are often taken as hints that cognitive resources have been subtracted from typing and devoted to other activities, such as planning, evaluating, etc. This exploratory pilot study combined both approaches to seek potential general tendencies and contrasts in informants' inferred mental processes when performing different writing tasks, through the analysis of their keylogged behaviors. The study tasks were retyping, monolingual free writing, translation, revision, and a multimodal task, namely monolingual text production based on an infographic leaflet. Task logs were chunked, and shorter time spans, including those within words, were analyzed following the Task Segment Framework (Muñoz & Apfelthaler, in press). Finally, the time-span analysis was combined with an analysis of the texts as to their lexical density, type-token ratio, and word frequency. Several previous results were confirmed, and some others were surprising. Time spans in free writing were longer between paragraphs and sentences, possibly hinting at planning; in translation, they were longer between clauses and words, suggesting more cognitive activity at these levels. On the other hand, the infographic was expected to facilitate the writing process, but most time spans were longer than in both free writing and translation. The results of the multimodal task and some other results suggest avenues for further research.
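A keylogged task can be chunked by thresholding inter-keystroke intervals and tagging each pause with the boundary it falls at. A toy sketch (the threshold value and boundary rules are illustrative simplifications, not the Task Segment Framework):

```python
def pause_spans(keystrokes, threshold=0.2):
    """keystrokes: time-ordered list of (timestamp_s, char).
    Return pauses longer than `threshold`, tagged by the boundary type
    inferred from the character typed just before the pause."""
    spans = []
    for (t0, c0), (t1, c1) in zip(keystrokes, keystrokes[1:]):
        dt = t1 - t0
        if dt < threshold:
            continue
        if c0 in ".!?":
            kind = "between sentences"
        elif c0 == " ":
            kind = "between words"
        else:
            kind = "within word"
        spans.append((t1, dt, kind))
    return spans

# Toy keylog: "To be." then a long pause before a new sentence
log = [(0.0, "T"), (0.1, "o"), (0.2, " "), (1.0, "b"),
       (1.1, "e"), (1.2, "."), (3.0, "A")]
spans = pause_spans(log)
print(spans)
```

Comparing the distribution of pause durations per boundary type across tasks (retyping vs. translation vs. multimodal writing) is the kind of contrast the study examines.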
Abstract:
The purpose of this work was to investigate possible patterns in the sewage bacterial content of four cities (Bologna, Budapest, Rome, Rotterdam) over time (March 2020 - November 2021), also considering the possible effects of the lockdown periods due to the COVID-19 pandemic. The sewage metagenomics data were provided within the VEO (Versatile Emerging infectious disease Observatory) project. The first analysis was the evaluation of the between-sample diversity, looking for (dis)similarities among the cities as well as among different time periods (seasonality). To this aim, we computed both similarity networks and Principal Coordinate Analysis (PCoA) plots based on the Bray-Curtis metric. Then, the alpha-biodiversity of the samples was estimated by means of different diversity indices. By looking at the temporal behaviour of the biodiversity in the four cities, we noticed an abrupt decrease in both Rome and Budapest in the summer of 2020, which is related to the prevalence of some species when the minimum occurred and to a change in the correlations among species (studied via correlation networks) that is enriched in the period of minimum biodiversity. The Rotterdam samples appear very different from those of the other cities, as confirmed by PCoA. Moreover, the Rotterdam time series proved to be stable and stationary also in terms of biodiversity. The low variability of the Rotterdam samples seems to be related to the species of the Pseudomonas genus, which are highly variable and plentiful in the other cities but are not among the most abundant in Rotterdam. Also, no seasonality effect emerged from the time series of the four cities. Regarding the impact of the lockdown periods due to the COVID-19 pandemic, no effect on the time series considered emerges from the limited data available. More samples will soon be available and these analyses will be performed on them as well, so that the possible effects of lockdowns may be studied.
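Both quantities at the core of this analysis, the Bray-Curtis dissimilarity between samples and an alpha-diversity index such as Shannon's H', are straightforward to compute from species-abundance vectors. A minimal sketch (toy abundances, not the VEO data):

```python
import numpy as np

def bray_curtis(u, v):
    """Bray-Curtis dissimilarity between two abundance vectors:
    sum |u_i - v_i| / sum (u_i + v_i); 0 = identical, 1 = disjoint."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    return np.abs(u - v).sum() / (u + v).sum()

def shannon(u):
    """Shannon diversity index H' = -sum p_i * ln(p_i),
    computed over the nonzero relative abundances."""
    p = np.asarray(u, float)
    p = p[p > 0] / p.sum()
    return -(p * np.log(p)).sum()

# Toy abundance vectors for two samples over four species
a = [10, 20, 30, 0]
b = [10, 0, 30, 20]
bc = bray_curtis(a, b)   # 1/3 for these vectors
h = shannon(a)
print(bc, h)
```

The pairwise Bray-Curtis matrix over all samples is what feeds both the similarity networks and the PCoA ordination mentioned above.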