960 results for Antarctic Treaty (1959)
Abstract:
Since the first underground nuclear explosion, carried out in 1958, the analysis of seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying its compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise definition of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. If a specific event is considered suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is supposed to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test: high-quality seismological systems are thought to be capable of detecting and locating very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method have high relative accuracy, although the absolute location of the whole cluster remains uncertain.
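The double-difference idea at the core of DDJHD can be sketched in a few lines: for a pair of nearby events recorded at the same station, the residual is formed on differential travel times, so a path anomaly common to both events cancels out. A minimal illustration (the function name and all numbers are ours, not the thesis code):

```python
# Minimal sketch of the double-difference residual at one station: the
# residual compares observed and calculated differential travel times for
# a pair of events i, j. All values below are illustrative.

def double_difference(t_obs_i, t_obs_j, t_calc_i, t_calc_j):
    """Residual dr_ij = (t_i - t_j)^obs - (t_i - t_j)^calc."""
    return (t_obs_i - t_obs_j) - (t_calc_i - t_calc_j)

# A systematic path anomaly shared by both events (here +1.5 s at the
# station) cancels in the difference, which is the method's main advantage:
bias = 1.5
r = double_difference(100.0 + bias, 98.0 + bias, 100.4, 98.1)
print(round(r, 3))  # -0.3: the 1.5 s bias has dropped out
```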
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both techniques, we have used cross-correlation among digital waveforms to minimize the errors linked with incorrect phase picking. The cross-correlation method relies on the similarity between waveforms of a pair of events at the same station, at the global scale, and on the similarity between waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary simulation-based tests of the reliability of our location techniques, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using data recorded by the IMS. Initially, the algorithm was applied to the differences among the original arrival times of the P phases, without cross-correlation. We found that the considerable geometrical spreading, noticeable in the standard locations (namely the locations produced by the analysts of the International Data Center (IDC) of the CTBT Organization, assumed as our reference), was substantially reduced by the application of our technique. This is what we expected, since the methodology was applied to a sequence of events belonging to the same seismic structure, for which we can assume real closeness among the hypocenters. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times have been removed, or at least reduced.
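The basic cross-correlation step can be sketched as follows: the relative delay between two similar waveforms is estimated from the position of their correlation peak. This is a generic NumPy illustration of the technique (windowing, filtering and phase selection are omitted; all names and values are ours):

```python
import numpy as np

# Generic sketch: estimate the delay of one waveform relative to another
# from the peak of their cross-correlation.

def xcorr_lag(a, b, dt):
    """Return the lag (seconds) of a relative to b at the correlation peak."""
    c = np.correlate(a, b, mode="full")
    k = int(np.argmax(c)) - (len(b) - 1)   # sample offset of the peak
    return k * dt

dt = 0.01                                   # 100 Hz sampling (illustrative)
t = np.arange(0, 2, dt)
pulse = np.exp(-((t - 0.5) / 0.05) ** 2)    # synthetic wavelet at 0.5 s
shifted = np.exp(-((t - 0.62) / 0.05) ** 2) # same wavelet delayed by 0.12 s
print(xcorr_lag(shifted, pulse, dt))        # 0.12
```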
The introduction of cross-correlation did not bring evident improvements to our results: the two sets of locations (with and without the application of the cross-correlation technique) are very similar to each other. This suggests that cross-correlation did not substantially improve the precision of the manual pickings. Probably the pickings reported by the IDC are good enough to make the random picking error less important than the systematic error on travel times. A further explanation for the limited benefit of the cross-correlation is that the events in our dataset generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller), and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to cross-correlation, we performed a signal interpolation to improve the time resolution. The resulting algorithm was applied to data collected during an experiment carried out in Israel between 1998 and 1999. The results pointed out the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in poor SNR conditions). Another remarkable feature of our procedure is that it does not demand a long processing time, so the user can immediately check the results. During a field survey, this makes possible a quasi-real-time check, allowing immediate optimization of the array geometry if suggested by the results at an early stage.
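The back azimuth and apparent velocity mentioned above follow from a plane-wave fit to the relative delays measured across the array. A minimal sketch for a tripartite geometry, assuming the delays have already been measured (the interpolation step is omitted; station coordinates and delays are illustrative, not from the Israel experiment):

```python
import math

# Plane-wave fit for a tripartite array: two relative delays determine the
# horizontal slowness vector s = (sx, sy), from which back azimuth and
# apparent velocity follow. All values are illustrative.

def plane_wave_fit(r1, r2, tau1, tau2):
    """Solve tau_i = s . r_i (i = 1, 2) by Cramer's rule;
    return (back_azimuth_deg, apparent_velocity)."""
    (x1, y1), (x2, y2) = r1, r2
    det = x1 * y2 - y1 * x2
    sx = (tau1 * y2 - y1 * tau2) / det
    sy = (x1 * tau2 - tau1 * x2) / det
    baz = math.degrees(math.atan2(-sx, -sy)) % 360.0  # direction back to source
    return baz, 1.0 / math.hypot(sx, sy)

# A wave from due north (back azimuth 0 deg) at 4 km/s propagates southward,
# so s = (0, -0.25) s/km:
r1, r2 = (1.0, 0.0), (0.0, 1.0)           # station offsets in km
tau1, tau2 = 0.0, -0.25                   # delays in s w.r.t. the reference
baz, v_app = plane_wave_fit(r1, r2, tau1, tau2)
print(round(baz, 1), round(v_app, 2))     # 0.0 4.0
```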
Abstract:
Doctoral program: Historiography, sources and methods of historical research
Abstract:
What exactly is tax treaty override? When does it arise? This thesis, the result of a co-directed PhD between the University of Bologna and Tilburg University, gives a deep insight into a topic that has not yet been analyzed in a systematic way; the analysis of tax treaty override is still at a preliminary stage. For this reason, the origin and nature of tax treaty override are first analyzed in their 'natural' context, i.e. within general international law. In order to characterize tax treaty override and deeply understand its peculiarities, the evaluation of the effects of general international law on tax treaties based on the OECD Model Convention is a necessary precondition. Therefore, the binding effects of an international agreement on state sovereignty are specifically investigated. Afterwards, the interpretation of the OECD Model Convention occupies the main part of the thesis, in order to develop an 'interpretative model' that can be applied whenever a case of tax treaty override needs to be detected. Fictitious income, exit taxes and CFC regimes are analyzed in order to verify their compliance with tax treaties based on the OECD Model Convention and to establish when the relevant legislation gives rise to tax treaty override.
Abstract:
The Treaty of Lisbon has brought remarkable changes and innovations to the European Union. As far as the Council of Ministers of the European Union ("the Council" hereinafter) is concerned, there are two significant innovations: double qualified majority voting and the new rotating Presidency scheme, which are expected to make the working of the Council more efficient, stable and consistent. With the modifications relating to the other key institutions, the Commission and the European Parliament, and with certain procedures being re-codified, the power of the Council varies accordingly, and the resulting inter-institutional balance calls for further research. As the Council is one of the co-legislators of the Union, its legislative function is likely to be influenced, positively or negatively, by the internal innovations and the inter-institutional re-balancing. Has the legislative function of the Council been reinforced or not? How could the Council better reach the functional goal designed by the Treaties' drafters? How should the Council's evolution after the Lisbon Treaty be evaluated in the light of European integration? This thesis attempts to find the answers by analyzing the two main internal innovations and the inter-institutional re-balancing.
Abstract:
The objective of this study is to understand how the concept of cultural heritage evolved in Italy in the second half of the twentieth century. The analysis therefore focuses on the historical and political events concerning the management, enhancement and protection of cultural heritage, with particular attention to the development of public policies in Italy between the late 1960s and the first half of the 1970s. This period can be regarded as a turning point in the debate and in the political actions that began in Italy in the post-unification period. Central steps in this process are the establishment of the Ministero per i Beni Culturali e Ambientali and the first regional initiatives in the field of culture. It is precisely in the relationship between center and periphery that a new attention to cultural heritage, and to the development of policies in this field, emerges. To provide a European perspective on the evolution of cultural policies, the French case is considered particularly significant, with the creation of the Ministry of Cultural Affairs at the end of the 1950s.
Antarctic cloud spectral emission from ground-based measurements, a focus on far infrared signatures
Abstract:
The present work belongs to the PRANA project, the first extensive field campaign of observation of atmospheric emission spectra covering the far-infrared (FIR) spectral region for more than two years. The principal deployed instrument is REFIR-PAD, a Fourier transform spectrometer that we use to study Antarctic cloud properties. A dataset covering the whole of 2013 has been analyzed. First, a selection of good-quality spectra is performed, using radiance values in a few chosen spectral regions as thresholds. These spectra are then described in a synthetic way by averaging radiances in selected intervals, converting the averages into brightness temperatures (BTs), and finally considering the differences between each pair of them. A supervised feature selection algorithm is implemented to select the features that are really informative about the presence, phase and type of cloud. Training and test sets are then collected by means of lidar quick-looks. The supervised classification of the overall monthly datasets is performed using a support vector machine (SVM). On the basis of this classification, and with the help of lidar observations, 29 non-precipitating ice cloud case studies are selected. A single spectrum, or at most an average over two or three spectra, is processed by the retrieval algorithm RT-RET, exploiting the main IR window channels, in order to extract cloud properties. Retrieved effective radii and optical depths are analyzed to compare them with literature studies and to evaluate possible seasonal trends. Finally, the atmospheric profiles output by the retrieval are used as inputs for simulations, assuming two different crystal habits, with the aim of examining our ability to reproduce radiances in the FIR. Substantial misestimations are found for the FIR micro-windows: a high variability is observed in the spectral pattern of the deviations of the simulations from the measured spectra, and an effort has been made to link these deviations to cloud parameters.
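The radiance-to-brightness-temperature conversion used to describe the averaged spectral intervals is the standard inversion of the Planck function. A minimal sketch (function names and the chosen channel are ours; wavenumbers in cm^-1, radiance in W / (m^2 sr cm^-1)):

```python
import math

# Standard Planck function and its inversion to brightness temperature (BT).
# SI constants; spectral quantities expressed per cm^-1.

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wn_cm, temp_k):
    """Planck radiance at wavenumber wn_cm (cm^-1), in W / (m^2 sr cm^-1)."""
    wn = wn_cm * 100.0   # cm^-1 -> m^-1
    rad = 2 * H * C**2 * wn**3 / (math.exp(H * C * wn / (KB * temp_k)) - 1)
    return rad * 100.0   # per m^-1 -> per cm^-1

def brightness_temp(wn_cm, rad):
    """Invert the Planck function: the BT such that planck(wn_cm, BT) == rad."""
    wn = wn_cm * 100.0
    rad_si = rad / 100.0
    return H * C * wn / (KB * math.log(1 + 2 * H * C**2 * wn**3 / rad_si))

# Round trip at 900 cm^-1 (an IR window channel) and 250 K:
print(round(brightness_temp(900.0, planck(900.0, 250.0)), 6))  # 250.0
```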
Abstract:
Surface-based measurement systems play a key role in defining the ground truth for climate modeling and satellite product validation. The Italian-French station Concordia has been operating year-round since 2005 at Dome C (75°S, 123°E, 3230 m) on the East Antarctic Plateau. A Baseline Surface Radiation Network (BSRN) site was deployed there and became operational in January 2006, measuring the downwelling components of the radiation budget; it was subsequently expanded in April 2007 to measure upwelling radiation. Hence, almost a decade of measurements is now available, suitable for defining a statistically significant climatology for the radiation budget of Concordia, including possible trends, by specifically assessing the effects of clouds and water vapor on the SW and LW net radiation. A well-known and robust clear-sky identification algorithm (Long and Ackerman, 2000) has been operationally applied to the downwelling SW components to identify cloud-free events and to fit a parametric equation determining the clear-sky reference over the Antarctic daylight periods (September to April). A new model for surface broadband albedo has been developed in order to better describe the features of the area. Then, a novel clear-sky LW parametrization, based on a priori assumptions about the inversion layer structure combined with the daily and annual oscillations of the surface temperature, has been adopted and validated. The longwave-based method is then exploited to extend cloud radiative forcing (CRF) studies to the nighttime period (winter). Results indicate an inter-annual and intra-annual warming behaviour, 13.70 W/m2 on average, approaching a neutral effect in summer, when the SW CRF compensates the LW CRF, and warming over the rest of the year, due predominantly to the CRF induced on the LW component.
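The CRF bookkeeping behind these numbers can be sketched in a few lines: the forcing of each component is the all-sky net flux minus the clear-sky reference, with positive values meaning a warming effect at the surface. All numbers below are illustrative, not Concordia measurements:

```python
# Minimal sketch of cloud radiative forcing (CRF): all-sky net flux minus
# the clear-sky reference, per component, in W/m^2. Values are illustrative.

def crf(down_all, up_all, down_clear, up_clear):
    """CRF = (down - up)_all-sky - (down - up)_clear-sky."""
    return (down_all - up_all) - (down_clear - up_clear)

# LW: clouds enhance downwelling emission -> positive (warming) forcing.
crf_lw = crf(180.0, 220.0, 140.0, 220.0)                     # +40.0 W/m^2

# SW over bright snow (albedo ~0.8): clouds cut the downwelling solar flux,
# but the high albedo limits the net loss -> modest negative forcing.
albedo = 0.8
crf_sw = crf(300.0, 300.0 * albedo, 420.0, 420.0 * albedo)   # -24.0 W/m^2

print(crf_lw + crf_sw)   # 16.0 -> net warming, the regime found outside summer
```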
Abstract:
Ice core evidence indicates that even though atmospheric CO2 concentrations did not exceed ~300 ppm at any point during the last 800 000 years, East Antarctica was at least ~3–4 °C warmer than preindustrial (CO2 ~280 ppm) in each of the last four interglacials. During the previous three interglacials, this anomalous warming was short-lived (~3000 years) and apparently occurred before the completion of Northern Hemisphere deglaciation. Hereafter, we refer to these periods as "Warmer than Present Transients" (WPTs). We present a series of experiments to investigate the impact of deglacial meltwater on the Atlantic Meridional Overturning Circulation (AMOC) and Antarctic temperature. It is well known that a slowed AMOC would increase southern sea surface temperature (SST) through the bipolar seesaw, and observational data suggest that the AMOC remained weak throughout the terminations preceding WPTs, strengthening rapidly at a time which coincides closely with peak Antarctic temperature. We present two 800 kyr transient simulations using the intermediate-complexity model GENIE-1 which demonstrate that meltwater forcing generates transient southern warming that is consistent with the timing of WPTs, but is not sufficient (in this single parameterisation) to reproduce the magnitude of observed warmth. In order to investigate model and boundary condition uncertainty, we present three ensembles of transient GENIE-1 simulations across Termination II (135 000 to 124 000 BP) and three snapshot HadCM3 simulations at 130 000 BP. Only with consideration of the possible feedback of West Antarctic Ice Sheet (WAIS) retreat does it become possible to simulate the magnitude of observed warming.
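The bipolar seesaw mechanism invoked above can be illustrated with the conceptual thermal seesaw model of Stocker and Johnsen (2003), in which the Southern Ocean integrates the inverted northern temperature signal with a relaxation timescale tau: a sustained AMOC weakening (cold north) produces gradual southern warming that peaks around the time the AMOC recovers. This is a hedged sketch of that textbook model, not the GENIE-1 or HadCM3 setup, and all parameter values are illustrative:

```python
# Conceptual thermal bipolar seesaw (after Stocker and Johnsen, 2003):
# southern temperature T_S relaxes toward the inverted northern signal -T_N
# with timescale tau. Forward-Euler integration; illustrative parameters.

def seesaw(t_north, dt, tau):
    """Integrate dT_S/dt = (-T_N - T_S) / tau; return the T_S series."""
    t_s, out = 0.0, []
    for t_n in t_north:
        t_s += dt * (-t_n - t_s) / tau
        out.append(t_s)
    return out

dt, tau = 10.0, 1000.0                       # years
# 3000 years of weak AMOC (cold north, T_N = -1), then recovery (T_N = 0):
t_north = [-1.0] * 300 + [0.0] * 300
t_south = seesaw(t_north, dt, tau)
print(round(max(t_south), 2))                # 0.95: warming peaks at recovery
```

The southern warming peaks just as the northern signal switches back, consistent with the observation above that the AMOC strengthened at a time coinciding closely with peak Antarctic temperature.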