1000 results for Università di Cagliari. Biblioteca


Relevance:

100.00%

Publisher:

Abstract:

Since the first underground nuclear explosion, carried out in 1958, the analysis of the seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise determination of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. If a specific event is deemed suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is expected to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test: high-quality seismological systems are thought to be capable of detecting and locating the very weak aftershocks triggered by an underground nuclear explosion in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method have high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both techniques we have used cross-correlation of digital waveforms to minimize the errors linked to incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary tests of the reliability of our location techniques based on simulations, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using data recorded by the IMS. Initially, the algorithm was applied to the differences among the original P-phase arrival times, without using cross-correlation. The considerable geographical scatter visible in the standard locations (namely those produced by the analysts of the International Data Center (IDC) of the CTBT Organization, taken as our reference) was markedly reduced by the application of our technique. This is what we expected, since the methodology was applied to a sequence of events for which a real closeness among the hypocenters can be assumed, as they belong to the same seismic structure. Our results highlight the main advantage of this methodology: the systematic errors affecting the arrival times are removed, or at least reduced.
The introduction of cross-correlation did not bring evident improvements to our results: the two sets of locations (with and without cross-correlation) are very similar to each other. This suggests that cross-correlation did not substantially improve the precision of the manual picks; the picks reported by the IDC are probably good enough to make the random picking error less important than the systematic error on travel times. A further explanation for the limited benefit of cross-correlation is that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller) and the signals are strongly attenuated by the large distance between the stations and the hypocentral area. At the local scale, in addition to cross-correlation, we performed a signal interpolation in order to improve the time resolution. The resulting algorithm was applied to the data collected during an experiment carried out in Israel between 1998 and 1999. The results pointed to the following conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in poor SNR conditions). Another remarkable feature of our procedure is that it does not require much time to process the data, so the user can check the results immediately. During a field survey, this will make a quasi-real-time check possible, allowing immediate optimization of the array geometry if the early results suggest it.
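The waveform cross-correlation used above, measuring the relative delay between two digitized signals and refining it below one sample, can be illustrated with a minimal NumPy sketch. This is not the thesis code; a three-point parabolic fit around the correlation peak stands in for the signal interpolation step.

```python
import numpy as np

def xcorr_lag(a, b, dt):
    """Estimate the delay of waveform a relative to b (positive if a arrives later).

    Normalized full cross-correlation, followed by a three-point
    parabolic fit around the peak to refine the lag below the
    sampling interval dt.
    """
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode="full")
    k = int(np.argmax(cc))
    frac = 0.0
    if 0 < k < len(cc) - 1:
        # Parabola through the peak sample and its two neighbours;
        # the vertex offset refines the discrete lag.
        y0, y1, y2 = cc[k - 1], cc[k], cc[k + 1]
        denom = y0 - 2.0 * y1 + y2
        if denom != 0.0:
            frac = 0.5 * (y0 - y2) / denom
    return (k - (len(b) - 1) + frac) * dt
```

In the global case the two inputs would be the same phase from a pair of nearby events recorded at one station; in the local case, the same event recorded at two sensors of the array.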
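The local-array technique estimates back azimuth and apparent velocity from the relative delays of the same phase across the array sensors. A generic plane-wave least-squares fit can be sketched as follows; the coordinate convention and function name are assumptions, not the thesis algorithm.

```python
import numpy as np

def plane_wave_fit(xy, t):
    """Back azimuth and apparent velocity of a plane wave crossing a small array.

    xy : (n, 2) sensor coordinates [m], x east, y north
    t  : (n,) arrival times [s] of the same phase at each sensor
    Solves t_i - t_0 = s . (x_i - x_0) for the horizontal slowness s
    in a least-squares sense.
    """
    A = xy - xy[0]
    d = t - t[0]
    s, *_ = np.linalg.lstsq(A, d, rcond=None)
    v_app = 1.0 / np.linalg.norm(s)          # apparent velocity [m/s]
    # The slowness vector points along propagation; the back azimuth
    # points back toward the source, clockwise from north.
    baz = np.degrees(np.arctan2(-s[0], -s[1])) % 360.0
    return baz, v_app
```

For a tripartite array the system is exactly determined (two relative delays, two slowness components); with more sensors the least-squares fit also averages out timing errors.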

Relevance:

100.00%

Publisher:

Abstract:

This work is set within the complex framework of the Energy & Utilities sector. It aims to analyse the current energy market, identify its drivers of change, and present an innovative business model for energy retail companies, with the goal of recovering efficiency in the management of the end customer and attempting to quantify the potential benefits. The study and design activity was carried out during a six-month internship at Engineering Ingegneria Informatica S.p.A., specifically at the viale Masini office in Bologna, following the student's own application and his immediate involvement in the business processes of the company's Utilities division. The work is divided into nine chapters. After a brief introduction to the Energy & Utilities sector, the first four chapters describe the production chains of the main services, the main market players, and the regulatory and tariff aspects that characterise the entire sector, examining in particular the formation of gas and electricity prices. Chapters five and six describe the main trends and competitive strategies at play in the Utilities market, and the importance of the customer from a CRM perspective that follows the "Customer Centric" model. The final chapters present, after a brief profile of the company where the student worked, the full analysis carried out, which feeds the business model that closes the work; this model aims to quantify the impact of the liberalisation process that has radically reshaped the Utilities sector in recent years, assessing the profitability of an average customer on the basis of an appropriate preliminary segmentation analysis. The business model presented in the last chapter constitutes an original and innovative solution for increasing that profitability.

Relevance:

100.00%

Publisher:

Abstract:

Mapping of the organizational processes, the organizational structure, and the supporting information systems, with an analysis of some of the issues encountered.

Relevance:

100.00%

Publisher:

Abstract:

This work is the direct outcome of a five-month internship at INTERTABA S.p.A. in Zola Predosa (BO), carried out within the Engineering department. In particular, the existing KPIs were analysed and improvement actions were proposed. The work developed in three phases. In the first phase, the current KPIs were analysed for each function within Engineering: Engineering Support, Project Engineering, Plant & Utilities Maintenance, Technical Warehouse, and General Services. In the second phase, several proposals for improving the existing KPIs were described for each function. Finally, for some functions, initiatives for implementing new KPIs were proposed.

Relevance:

100.00%

Publisher:

Abstract:

In order to protect river water quality, which in urban areas is affected by both continuous and intermittent discharges, measures must be adopted to intercept and treat these polluted flows. During rain events in particular, river water quality is affected by the activation of combined sewer overflows (CSOs). Built to protect the sewer system and the wastewater treatment plant (WWTP) from the increased flows caused by heavy rains, CSOs divert excess flows into the receiving water body. On the basis of several scientific papers, as well as direct evidence, demonstrating the detrimental effect of CSO discharges, the legislative framework has also moved towards a stream-standard point of view. The Water Framework Directive (WFD, 2000/60/EC) sets new goals for the quality of receiving waters, including groundwater, through an integrated immission/emission philosophy in which emission limits are associated with effluent standards based on the characteristics of the receiving water and its specific use. For surface waters the objective is a "good" ecological and chemical quality status. A surface water body is of good ecological quality if there is only a slight departure from the biological community that would be expected under conditions of minimal anthropogenic impact. Each Member State authority is responsible for preparing and implementing a River Basin Management Plan to achieve good ecological quality and comply with the WFD requirements. To meet the WFD targets, and thus improve urban receiving water quality, a CSO control strategy needs to be implemented. Temporarily storing the overflow (or at least part of it) in tanks and treating it in the WWTP after the end of the storm has shown good results in reducing the total pollutant mass spilled into the receiving river. To comply with the WFD, the Italian national authority sets the general framework, and each Region has to adopt a Water Remediation Plan (PTA, Piano Tutela Acque) defining the goals, methods, and deadlines for improving river water quality. 
The Emilia-Romagna PTA requires a 25% reduction by 2008, and a 50% reduction by 2015, of the total pollutant mass delivered by CSO spills. In order to plan remediation actions, a deep insight into spill dynamics is therefore of great importance. The present thesis investigates spill dynamics through both a numerical and an experimental approach. A four-month monitoring and sampling campaign was set up on the Bologna sewer network and on the Navile Channel, which is the receiving water of the WWTP and receives flows from up to 28 CSOs during rain events. In parallel, a full model of the sewer network was built with the commercial software InfoWorks CS and calibrated with the data from the monitoring and sampling campaign. Further model simulations were then used to look for interdependencies among the spilled masses, the rainfall characteristics, and the basin characteristics. The thesis can be seen as a basis for further insights and for planning remediation actions.
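Reduction targets of this kind are expressed in terms of the pollutant mass spilled, which is obtained by integrating the product of overflow discharge and pollutant concentration over the spill event. A minimal sketch in Python/NumPy, with purely hypothetical numbers (not data from the Bologna campaign):

```python
import numpy as np

# Hypothetical 30-minute CSO spill sampled every 60 s; the discharge and
# concentration values are illustrative only, not measured data.
t = np.arange(0.0, 1801.0, 60.0)                  # time [s]
q = 0.8 * np.exp(-(((t - 600.0) / 400.0) ** 2))   # overflow discharge [m^3/s]
c = 120.0 + 80.0 * np.exp(-t / 500.0)             # pollutant concentration [g/m^3]

# Total spilled mass = integral of Q(t) * C(t) dt (trapezoidal rule);
# m^3/s * g/m^3 * s gives grams.
f = q * c
mass_g = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t))
mass_kg = mass_g / 1000.0
```

With real campaign data, the discharge and concentration series would come from the flow meters and the analysed samples, interpolated onto a common time base before integration.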