960 results for Geophysical instruments
Abstract:
Subduction zones are the most favorable settings for tsunamigenic earthquakes, where friction between oceanic and continental plates causes strong seismicity. The topics and methodologies discussed in this thesis are focused on understanding the rupture process of the seismic sources of great earthquakes that generate tsunamis. Tsunamigenesis is controlled by several kinematic characteristics of the parent earthquake, such as the focal mechanism, the depth of the rupture and the slip distribution along the fault area, and by the mechanical properties of the source zone. Each of these factors plays a fundamental role in tsunami generation. Inferring the source parameters of tsunamigenic earthquakes is therefore crucial for understanding the generation of the consequent tsunami and thus for mitigating the risk along the coasts. The typical way to gather information on the source process is to invert the available geophysical data. Tsunami data, moreover, are useful for constraining the portion of the fault area that extends offshore, generally close to the trench, which other kinds of data are unable to constrain. In this thesis I discuss the rupture process of some recent tsunamigenic events, as inferred by means of an inverse method. I first present the 2003 Tokachi-Oki (Japan) earthquake (Mw 8.1), for which the slip distribution on the fault was inferred by inverting tsunami waveform, GPS, and bottom-pressure data. The joint inversion of tsunami and geodetic data constrains the slip distribution on the fault much better than the separate inversions of the single datasets. We then studied the earthquake that occurred in 2007 in southern Sumatra (Mw 8.4). By inverting several tsunami waveforms, both in the near and in the far field, we determined the slip distribution and the mean rupture velocity along the causative fault. The largest patch of slip was concentrated in the deepest part of the fault, which is the likely reason for the small tsunami waves that followed the earthquake and underlines how strongly the depth of the rupture controls tsunamigenesis. Finally, we present a new rupture model for the great 2004 Sumatra earthquake (Mw 9.2). We performed a joint inversion of tsunami waveform, GPS and satellite altimetry data to infer the slip distribution, the slip direction, and the rupture velocity on the fault. Furthermore, in this work we present a novel method to estimate, in a self-consistent way, the average rigidity of the source zone. Estimating the source-zone rigidity is important since it may play a significant role in tsunami generation; particularly for slow earthquakes, a low rigidity value is sometimes necessary to explain how an earthquake with a relatively low seismic moment may generate a significant tsunami, a point that may be relevant for explaining the mechanics of tsunami earthquakes, one of the open issues in present-day seismology. The investigation of these tsunamigenic earthquakes has underlined the importance of jointly inverting different geophysical data to determine the rupture characteristics.
The results shown here have important implications for the implementation of new tsunami warning systems, particularly in the near field, for the improvement of the current ones, and for the planning of inundation maps for tsunami-hazard assessment along coastal areas.
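As a minimal sketch of the joint-inversion idea described above (not the author's actual code), slip on a set of subfaults can be estimated by weighted, regularized linear least squares once Green's functions relating unit slip to each observable have been precomputed. All array names, sizes, dataset weights and the smoothing operator below are illustrative assumptions.

```python
import numpy as np

# Hypothetical inputs: Green's functions mapping unit slip on each subfault to each
# observation (in practice precomputed with tsunami / elastic-deformation modelling).
rng = np.random.default_rng(0)
n_subfaults = 40
G_tsunami = rng.normal(size=(200, n_subfaults))   # (n_tsunami_samples, n_subfaults)
G_gps = rng.normal(size=(60, n_subfaults))        # (n_gps_offsets,     n_subfaults)

# Synthetic "true" slip and noisy observations, only to exercise the inversion.
true_slip = np.maximum(rng.normal(0.5, 1.0, size=n_subfaults), 0.0)
d_tsunami = G_tsunami @ true_slip + 0.1 * rng.normal(size=200)
d_gps = G_gps @ true_slip + 0.05 * rng.normal(size=60)

def joint_slip_inversion(blocks, weights, smoothing=1.0):
    """Weighted, smoothed least-squares estimate of slip on each subfault.

    blocks  : list of (G, d) pairs, one per dataset
    weights : relative weight given to each dataset
    """
    rows = [w * G for (G, _), w in zip(blocks, weights)]
    rhs = [w * d for (_, d), w in zip(blocks, weights)]
    n = rows[0].shape[1]
    # First-difference operator as a crude smoothing (roughness) constraint.
    L = smoothing * (np.eye(n - 1, n, k=1) - np.eye(n - 1, n))
    A = np.vstack(rows + [L])
    b = np.concatenate(rhs + [np.zeros(n - 1)])
    slip, *_ = np.linalg.lstsq(A, b, rcond=None)
    return slip

slip = joint_slip_inversion([(G_tsunami, d_tsunami), (G_gps, d_gps)], weights=[1.0, 2.0])
print(np.round(np.c_[true_slip, slip][:5], 2))   # compare true vs. recovered slip
```

Weighting the GPS block more heavily and varying the smoothing strength is the usual way such a joint scheme is tuned; the point of the joint system is simply that both data blocks constrain the same slip vector.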
Abstract:
The theory of the 3D multipole probability tomography method (3D GPT) for imaging the source poles, dipoles, quadrupoles and octopoles of a geophysical vector or scalar field dataset is developed. A geophysical dataset is assumed to be the response of an aggregation of poles, dipoles, quadrupoles and octopoles. These physical sources are used to reconstruct, without a priori assumptions, the most probable position and shape of the true buried geophysical sources, by determining the location of their centres and of the critical points of their boundaries, such as corners, wedges and vertices. The theory is then adapted to the geoelectrical, gravity and self-potential methods. A few synthetic examples using simple geometries and three field examples are discussed in order to demonstrate the notably enhanced resolution power of the new approach. First, the application to a field example from a dipole–dipole geoelectrical survey carried out in the archaeological park of Pompei is presented. The survey aimed at recognizing remains of the ancient Roman urban network, including roads, squares and buildings, buried under the thick pyroclastic cover deposited during the 79 AD Vesuvius eruption. The revealed anomaly structures are ascribed to well-preserved remnants of aligned walls of Roman edifices, buried and partially destroyed by the 79 AD Vesuvius pyroclastic fall. A second field example concerns a gravity survey carried out in the volcanic area of Mount Etna (Sicily, Italy), aimed at imaging as accurately as possible the differential mass density structure within the first few kilometres of depth inside the volcanic apparatus. An assemblage of vertical prismatic blocks appears to be the most probable gravity model of the Etna apparatus within the first 5 km of depth below sea level. Finally, an experimental SP dataset collected in the Mt. Somma-Vesuvius volcanic district (Naples, Italy) is analysed in order to define the location and shape of the sources of two SP anomalies of opposite sign detected in the northwestern sector of the surveyed area. The modelled sources are interpreted as the polarization state induced by an intense hydrothermal convective flow within the volcanic apparatus, from the free surface down to about 3 km depth b.s.l.
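A minimal sketch of the occurrence-probability idea behind this kind of tomography: the measured surface anomaly is cross-correlated with the theoretical anomaly of an elementary source placed at each node of a subsurface scanning grid, and the highest correlation marks the most probable source centre. The monopole kernel, grid geometry and synthetic data below are illustrative choices, not the 3D GPT implementation itself.

```python
import numpy as np

def monopole_scanner(x, y, xq, yq, zq):
    """Normalized surface response of a unit point source buried at (xq, yq, zq)."""
    r = np.sqrt((x - xq) ** 2 + (y - yq) ** 2 + zq ** 2)
    return zq / r ** 3          # vertical-field kernel of a buried monopole

def occurrence_probability(x, y, anomaly, trial_points):
    """Cross-correlation coefficient between data and scanner at each trial point."""
    eta = []
    data_norm = np.sqrt(np.sum(anomaly ** 2))
    for xq, yq, zq in trial_points:
        s = monopole_scanner(x, y, xq, yq, zq)
        eta.append(np.sum(anomaly * s) / (data_norm * np.sqrt(np.sum(s ** 2))))
    return np.array(eta)

# Synthetic test: one buried source at depth 5; scan a column of trial depths.
x, y = np.meshgrid(np.linspace(-50, 50, 41), np.linspace(-50, 50, 41))
anomaly = monopole_scanner(x, y, 0.0, 0.0, 5.0)
trials = [(0.0, 0.0, z) for z in np.arange(1.0, 15.0, 1.0)]
eta = occurrence_probability(x, y, anomaly, trials)
print("most probable depth:", trials[int(np.argmax(eta))][2])
```

By the Cauchy-Schwarz inequality the coefficient is bounded by 1 and peaks where the scanner best matches the data, which is why the maxima map the most probable source positions.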
Abstract:
The research is part of a survey for the assessment of the hydraulic and geotechnical conditions of river embankments, funded by the Reno River Basin Regional Technical Service of the Emilia-Romagna Region. The hydraulic safety of the Reno River, one of the main rivers in north-eastern Italy, is indeed of primary importance to the Emilia-Romagna regional administration. The large longitudinal extent of the banks (several hundred kilometres) has generated great interest in non-destructive geophysical methods, which, compared to other methods such as drilling, allow faster and often less expensive acquisition of high-resolution data. The present work aims to test Ground Penetrating Radar (GPR) for the detection of local non-homogeneities (mainly stratigraphic contacts, cavities and conduits) inside the embankments of the Reno River and its tributaries, taking into account supplementary data collected with traditional destructive tests (boreholes, cone penetration tests, etc.). A comparison with other non-destructive methodologies, such as electrical resistivity tomography (ERT), Multi-channel Analysis of Surface Waves (MASW) and FDEM induction, was also carried out in order to verify the usability of GPR and to support the integration of various geophysical methods into the regular maintenance and checking of embankment conditions. The first part of this thesis is dedicated to the state of the art concerning the geographic, geomorphologic and geotechnical characteristics of the Reno River and its tributaries' embankments, as well as to the description of some geophysical applications on embankments of European and North American rivers, which served as the bibliographic basis for this thesis. The second part is an overview of the geophysical methods employed in this research (with particular attention to GPR), also reporting their theoretical basis and examining in more depth some techniques of geophysical data analysis and representation when applied to river embankments. The subsequent chapters, following the main scope of this research, namely to highlight the advantages and drawbacks of Ground Penetrating Radar applied to the embankments of the Reno River and its tributaries, show the results obtained by analysing different cases that could lead to the formation of weakness zones and, subsequently, to embankment failure. Among the advantages, a considerable acquisition speed and a spatial resolution of the obtained data unmatched by the other methodologies were recorded. With regard to the drawbacks, some factors related to the attenuation of wave propagation, due to the different content of clay, silt and sand, as well as surface effects, significantly limited the correlation between GPR profiles and geotechnical information and therefore hampered the embankment safety assessment. In summary, Ground Penetrating Radar can represent a suitable tool for checking river dike conditions, but its use is significantly limited by the geometric and geotechnical characteristics of the levees of the Reno River and its tributaries. In fact, only the shallower part of the embankment could be investigated, yielding information related only to changes in electrical properties, without any quantitative measurement.
GPR application is therefore of limited value for a preliminary assessment of embankment safety conditions, whereas for detailed campaigns at shallow depth, aiming at immediate results with optimal precision, its use is strongly recommended. The cases where a multidisciplinary approach was tested revealed an effective interconnection of the various geophysical methodologies employed, with qualitative results in the preliminary phase (FDEM), a quantitative and reliable description of the subsoil (ERT) and, finally, fast and highly detailed analysis (GPR). As a recommendation for future research, the combined use of several geophysical methods to assess the safety conditions of river embankments is strongly suggested, especially when facing a likely flood event, when the entire extent of the embankments must be investigated.
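As a small illustration of why only the shallower part of an embankment is reached, the usual time-to-depth conversion for GPR is shown below (a generic sketch, not taken from the thesis); the relative permittivity and two-way travel time are hypothetical values for moist, fine-grained embankment material.

```python
import numpy as np

C = 0.3  # free-space EM wave speed, m/ns

def gpr_depth(twt_ns, rel_permittivity):
    """Convert two-way travel time (ns) to depth (m) for a given relative permittivity."""
    v = C / np.sqrt(rel_permittivity)   # wave velocity in the ground, m/ns
    return v * twt_ns / 2.0

# Example: a reflector at 60 ns two-way time in moist silty material (eps_r ~ 16)
# lies at about 2.25 m depth, i.e. still within the shallow body of the embankment.
print(round(gpr_depth(60.0, 16.0), 2))
```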
Abstract:
Surgical instruments are important devices used as indispensable support in patient care in hospitals. They are characterized by a complete life cycle that conventionally begins in the "Store", where sterilized instruments are picked to be used in the operating theatres, and ends again in the "Store", where the instruments are stocked to be reused in a new cycle. The individual phases of the cycle may suffer delays with respect to the planned times, therefore failing to ensure that the correct number of instruments reaches the operating theatres at the scheduled times. The project illustrated here aims at optimizing the cycle of surgical instruments within a new hospital, applying the principles of the Lean philosophy and in particular the Poka-Yoke, 5S and traceability methods. To achieve this goal, the project was structured as follows. First, the entire life cycle of the instruments was observed in the two main hospitals of Copenhagen (Hervel and Gentofte hospitals). This made it possible to identify the steps of the cycle and to observe in the field the main problems related to the cycle itself, namely: low flexibility, the decentralization of the cleaning and store departments with respect to the operating theatres, and difficulties in lifting heavy instruments. Once the necessary information had been collected, the experimental phase followed, in which two different life cycles were mapped using three analysis tools: IDEF0, which provides a hierarchical view of the cycle; Value Stream Mapping, which highlights the main wastes of the cycle; and the Tecnomatix simulator, which offers a dynamic view of the analysis. The first mapped cycle was created with the sole purpose of highlighting the steps of the cycle and some problems encountered in the hospitals visited. The second cycle, instead, was designed from a Lean perspective in order to solve some of the main problems found in the two hospitals and to optimize the first cycle. In the second cycle the main innovations introduced were: the use of barcodes and RFID tags to identify and trace the position of the items, the use of an automated storage and retrieval store to minimize the times for inserting and picking items, and finally the use of three types of trolley, to allow a flexible care service. Poka-Yoke solutions were also proposed to solve some of the manual-handling problems of the hospitals. To highlight the advantage of the second instrument cycle, the lead time parameter was considered and the two previously created simulations were compared. This comparison showed a radical reduction of times (as well as of the associated costs) of the new solution with respect to the first. The topics of this research are treated in English in the following chapters. Enjoy the reading.
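Purely as an illustrative sketch of the lead-time comparison mentioned above, the idea is to sum the per-step durations of the current and of the Lean cycle and compare the totals; the step names and durations below are invented placeholders, not data from the two hospitals or from the Tecnomatix models.

```python
# Durations in minutes per instrument tray (placeholder values).
current_cycle = {
    "store picking": 20, "transport to OR": 30, "use in OR": 120,
    "transport to cleaning": 40, "cleaning": 60, "sterilization": 90,
    "return to store": 35,
}
lean_cycle = {  # assumes AS/RS picking, RFID tracing and dedicated trolleys
    "store picking (AS/RS)": 5, "transport to OR": 15, "use in OR": 120,
    "transport to cleaning": 15, "cleaning": 60, "sterilization": 90,
    "return to store (AS/RS)": 5,
}

def lead_time(cycle):
    """Total lead time of one instrument cycle, as the sum of its step durations."""
    return sum(cycle.values())

print("current cycle:", lead_time(current_cycle), "min")
print("lean cycle   :", lead_time(lean_cycle), "min")
```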
Abstract:
Basic concepts and definitions relative to Lagrangian Particle Dispersion Models (LPDMs) for the description of turbulent dispersion are introduced. The study focuses on LPDMs that use, as input for the large-scale motion, fields produced by Eulerian models, with the small-scale motions described by Lagrangian Stochastic Models (LSMs). Data from two different dynamical models have been used: a Large Eddy Simulation (LES) and a General Circulation Model (GCM). After reviewing the small-scale closure adopted by the Eulerian model, the development and implementation of appropriate LSMs is outlined. The basic requirement of every LPDM used in this work is its fulfilment of the Well Mixed Condition (WMC). For the description of dispersion in the GCM domain, a stochastic model of Markov order 0, consistent with the eddy-viscosity closure of the dynamical model, is implemented. An LSM of Markov order 1, more suitable for shorter timescales, has been implemented for the description of the unresolved motion of the LES fields. Different assumptions on the small-scale correlation time are made. Tests of the LSM on GCM fields suggest that the use of an interpolation algorithm able to maintain analytical consistency between the diffusion coefficient and its derivative is mandatory if the model is to satisfy the WMC. A dynamical time-step selection scheme based on the shape of the diffusion coefficient is also introduced, and the criteria for the integration-step selection are discussed. Absolute and relative dispersion experiments are made with various unresolved-motion settings for the LSM on LES data, and the results are compared with laboratory data. The study shows that the unresolved turbulence parameterization has a negligible influence on the absolute dispersion, while it affects the contribution of the relative dispersion and meandering to absolute dispersion, as well as the Lagrangian correlation.
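The Markov order-0 (random displacement) model has a compact standard form: with eddy diffusivity K(z), a particle is advanced as dz = (∂K/∂z) dt + sqrt(2K) dW, and the drift term ∂K/∂z is exactly what keeps an initially well-mixed distribution well mixed. The sketch below checks this numerically with an arbitrary, idealized diffusivity profile, not the closure of the GCM used in the thesis.

```python
import numpy as np

def K(z, h=1000.0, kmax=50.0):
    """Idealized boundary-layer eddy diffusivity (m^2/s), zero at z = 0 and z = h."""
    return kmax * 4.0 * (z / h) * (1.0 - z / h)

def dKdz(z, h=1000.0, kmax=50.0):
    """Analytical derivative of K, needed for the WMC drift term."""
    return kmax * 4.0 * (1.0 - 2.0 * z / h) / h

def step(z, dt, h=1000.0, rng=np.random.default_rng()):
    """Advance particle heights by one time step; reflect at both boundaries."""
    dw = rng.normal(0.0, np.sqrt(dt), size=z.shape)
    z_new = z + dKdz(z) * dt + np.sqrt(2.0 * K(z)) * dw
    z_new = np.abs(z_new)                                 # reflect at the ground
    z_new = np.where(z_new > h, 2.0 * h - z_new, z_new)   # reflect at the top
    return z_new

# WMC check: a well-mixed initial distribution should stay well mixed.
z = np.random.default_rng(1).uniform(0.0, 1000.0, size=20_000)
for _ in range(1000):
    z = step(z, dt=5.0)
print("fraction in lower half after 5000 s:", round(float(np.mean(z < 500.0)), 3))
```

Dropping the ∂K/∂z drift makes particles accumulate where K is small, which is the numerical signature of a WMC violation and the reason the thesis insists on analytically consistent K and ∂K/∂z.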
Abstract:
We noninvasively detected the characteristics and location of a regional fault in an area of poor bedrock exposure complicated by karst weathering features in the subsurface. Because this regional fault is associated with sinkhole formation, its location is important for hazard avoidance. The bedrock lithologies on either side of the fault trace are similar; hence, we chose an approach that capitalized on the complementary strengths of very low frequency (VLF) electromagnetic, resistivity, and gravity methods. VLF proved most useful as a first-order reconnaissance tool, allowing us to define a narrow target area for further geophysical exploration. Fault-related epikarst was delineated using resistivity. Ultimately, a high-resolution gravity survey and subsequent inverse modeling using the results of the resistivity survey helped to further constrain the location and approximate orientation of the fault. The combined results indicated that the location of the fault trace needed to be adjusted 53 m south of the current published location and was consistent with a north-dipping thrust fault. Additionally, a gravity low south of the fault trace agreed with the location of conductive material from the resistivity and VLF surveys. We interpreted these anomalies to represent enhanced epikarst in the fault footwall. We clearly found that a staged approach involving a progression of methods beginning with a reconnaissance VLF survey, followed by high-resolution gravity and electrical resistivity surveys, can be used to characterize a fault and fault-related karst in an area of poor bedrock surface exposure.
Abstract:
The rise of new food assistance instruments, including local and regional procurement, cash, and vouchers, has outpaced growth in understanding of the tradeoffs among, and impacts of, these options relative to traditional food aid. Response choices rarely appear to result from systematic response analyses. Further, impacts along multiple dimensions (timeliness, cost-effectiveness, local market effects, recipient satisfaction, food quality, impact on smallholder suppliers, etc.) may be competing or synergistic. No single food assistance tool is always and everywhere preferable. A growing body of evidence, including the papers in this special section, nonetheless demonstrates the clear value-added of new food assistance instruments. (C) 2013 Elsevier Ltd. All rights reserved.
Abstract:
The purpose of this research project is to continue exploring the Montandon Long-Term Hydrologic Research Site (LTHR) by using multiple geophysical methods to obtain more accurate and precise information regarding the subsurface hydrologic properties of a local gravel ridge, which are important to both the health of surrounding ecosystems and local agriculture. By using non-invasive geophysical methods such as seismic refraction, direct-current resistivity and ground penetrating radar (GPR), instead of invasive methods such as borehole drilling, which displace sediment and may alter water flow, data collection is less likely to bias the data itself. In addition to imaging the gravel ridge subsurface, another important research purpose is to observe how both the water table elevation and the moisture gradient (moisture content of the unsaturated zone) change over a seasonal time period and directly after storm events. The combination of three types of data collection allows the strengths of each method to be combined, providing more strongly supported conclusions than previous research. Precipitation and geophysical data suggest that an overall increase in precipitation during the summer months causes a sharp decrease in subsurface resistivity within the unsaturated zone. GPR velocity data indicate a significant immediate increase in moisture content within the shallow vadose zone (< 1 m), suggesting that rain water was infiltrating into the shallow subsurface. Furthermore, the combination of resistivity and GPR results suggests that the decreased resistivity within the shallow layers is due to increased ion content within the groundwater. This is unexpected, as rainwater is assumed to have a DC resistivity value of 3.33 x 10^5 ohm-m. These results may suggest that ions within the sediment must be incorporated into the infiltrating water.
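A generic way to see why pore-water chemistry, rather than the rainwater itself, controls the measured resistivity is Archie's law for the bulk resistivity of a porous medium. This is an illustration under textbook parameter assumptions (cementation and saturation exponents, porosity, saturation), not a calculation from the study itself.

```python
def archie_bulk_resistivity(rho_w, porosity, saturation, a=1.0, m=2.0, n=2.0):
    """Bulk resistivity (ohm-m) from pore-water resistivity via Archie's law."""
    return a * rho_w * porosity ** (-m) * saturation ** (-n)

# Pure rainwater (~3.3e5 ohm-m) vs. infiltrating water that has dissolved ions from
# the sediment (~30 ohm-m), at 30% porosity and 50% saturation: the bulk resistivity
# drops by roughly four orders of magnitude once the pore water becomes conductive.
for rho_w in (3.3e5, 30.0):
    print(rho_w, "ohm-m pore water ->", round(archie_bulk_resistivity(rho_w, 0.30, 0.50), 1), "ohm-m bulk")
```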
Abstract:
Water held in the unsaturated zone is important for agriculture and construction and is replenished by infiltrating rainwater. Monitoring the soil water content of clay soils using ground-penetrating radar (GPR) has received little research attention, as clay soils attenuate the GPR signal. In this study, GPR common-midpoint soundings (CMPs) are used in the clayey soils of the Miller Run floodplain to monitor changes in the soil water content (SWC) before and after rainfall events. GPR accomplishes this task because an increase in water content increases the dielectric constant of the subsurface material and decreases the velocity of the GPR wave. Using an empirical relationship between dielectric constant and SWC, the Topp relation, we are able to calculate a SWC from these velocity measurements. Non-invasive electromagnetic, resistivity, and seismic surveys were performed, and from these surveys the layering at the field site was delineated. EM characterized the horizontal variation of the soil, allowing us to target the most clay-rich area. At the CMP location, resistivity indicates that the vertical structure of the subsurface consists of a 40 cm thick layer with a resistivity of 100 ohm-m; between 40 cm and 1.5 m is a layer with a resistivity of 40 ohm-m. The thickness estimates were confirmed with invasive auger and trenching methods away from the CMP location. GPR CMPs were collected around a July 2013 and a September 2013 storm. The velocity observations from the CMPs had a precision of +/- 0.001 m/ns as assessed by repeat analysis. For both storms, the GPR data showed the expected relationship between the rainstorms and the calculated SWC, with the SWC increasing sharply after the rainstorm and decreasing as time passed. We compared these data to auger core samples collected at the same time as the CMPs, and the volumetric analysis of the cores confirmed the trend seen in the GPR, with SWC values between 3 and 5 percent lower than the GPR estimates. Our data show that we can, with good precision, monitor changes in the SWC of conductive soils in response to rainfall events, despite the attenuation induced by the clay.
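A minimal sketch of the velocity-to-SWC conversion described above, using the low-loss approximation for the dielectric constant and the Topp et al. (1980) polynomial; the example velocity is a hypothetical value, not one of the study's measurements.

```python
C = 0.3  # free-space EM velocity, m/ns

def dielectric_from_velocity(v):
    """Apparent dielectric constant from GPR velocity (m/ns), low-loss assumption."""
    return (C / v) ** 2

def topp_swc(kappa):
    """Volumetric soil water content from dielectric constant (Topp et al., 1980)."""
    return -5.3e-2 + 2.92e-2 * kappa - 5.5e-4 * kappa ** 2 + 4.3e-6 * kappa ** 3

# Example: a CMP velocity of 0.06 m/ns gives kappa = 25 and a SWC of about 0.40.
v = 0.06
kappa = dielectric_from_velocity(v)
print(round(kappa, 1), round(topp_swc(kappa), 3))
```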
Abstract:
Icy debris fans are newly described landforms (Kochel and Trop, 2008, 2012) that develop immediately after deglaciation on Earth; similar features have been observed on Mars. The subsurface characteristics of icy debris fans have not previously been investigated. Ground penetrating radar (GPR) was used to non-invasively investigate the subsurface characteristics of icy debris fans near McCarthy, Alaska, USA. The three fans investigated in Alaska are the East, West, and Middle fans (Kochel and Trop, 2008, 2012), which lie below the Nabesna ice cap and on top of the McCarthy Creek Glacier. Icy debris fans in general are a largely unexplored suite of paraglacial landforms and processes in alpine regions. Recent field studies focused on direct observations and depositional processes, and their results showed that a fan's composition is primarily influenced by the type and frequency of the depositional processes that supply it. Photographic studies show that the East Fan receives far more ice and snow avalanches, whereas the Middle and West Fans receive fewer mass-wasting events but more clastic debris, deposited by rock falls and icy debris flows. GPR profiles and wide-angle reflection and refraction (WARR) surveys, consisting of both common mid-point (CMP) and common shot-point (CSP) surveys, were used to investigate the subsurface geometry of the fans and of the McCarthy Creek Glacier. All GPR surveys were collected in July 2013 with 100 MHz bistatic antennas. Four axial profiles and three cross-fan profiles were collected on the West and Middle fans as well as on the McCarthy Creek Glacier in order to investigate the relationship between the three features. The GPR profiles yielded reflectors that were continuous for more than 10 m, as well as hyperbolic reflections in the subsurface. Determining the depth to these reflections requires knowledge of the subsurface velocity, so eight WARR surveys were collected on the fans and on the McCarthy Creek Glacier to provide information on the variability of subsurface velocities. The profiles on the Middle and West fans contain more reflections than those on the McCarthy Creek Glacier. Based on the WARR surveys, we attribute the lower energy return in the glacier to two causes: 1) increased attenuation in wet ice compared with the drier ice on the fans, where GPR velocities exceed 0.15 m/ns, and 2) a lack of interfaces in the glacier compared with the fans, where interfaces are inferred to be produced by alternating layers of stratified ice and lithic-rich debris. The GPR profiles on the West and Middle Fans show a shallow subsurface dominated by lenticular reflections interpreted to be consistent with the shape of the surficial deposits. The West Fan is distinguished from the Middle Fan by the nature of its reflection patterns and by the thickness of its reflection packages, which clearly show a greater thickness on the Middle Fan. The changes in subsurface reflections between the Middle and West Fans, as well as the McCarthy Creek Glacier, are thought to reflect the type and frequency of depositional processes and the surrounding bedrock and talus slopes.
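A small sketch of how a single-layer velocity can be extracted from WARR/CMP picks of a reflector, using the standard moveout hyperbola t² = t0² + x²/v² and a straight-line fit of t² against x²; the offsets, picks and velocity below are synthetic placeholders, not the Alaska data.

```python
import numpy as np

def fit_moveout(offsets_m, picks_ns):
    """Least-squares fit of t^2 = t0^2 + x^2/v^2; returns (v in m/ns, t0 in ns)."""
    A = np.column_stack([np.ones_like(offsets_m), offsets_m ** 2])
    coef, *_ = np.linalg.lstsq(A, picks_ns ** 2, rcond=None)
    t0 = np.sqrt(coef[0])
    v = np.sqrt(1.0 / coef[1])
    return v, t0

# Synthetic picks for ice-like material (v = 0.16 m/ns, reflector at 8 m depth).
x = np.arange(1.0, 30.0, 1.0)
t = np.sqrt((2.0 * 8.0 / 0.16) ** 2 + (x / 0.16) ** 2)
v, t0 = fit_moveout(x, t)
print(round(v, 3), "m/ns, depth ~", round(v * t0 / 2.0, 1), "m")
```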
Abstract:
A literature review of the most widely used condition-specific, self-administered assessment questionnaires for low back pain has been undertaken. General and historical aspects, reliability, responsiveness and minimum clinically important difference, external validity, floor and ceiling effects, and available languages were analysed. These criteria, however, are only part of the consideration. Of similar importance are the content and the wording of questions and answers in each of the six questionnaires, and an analysis of the different score results. The issue of score bias is discussed and suggestions are given in order to increase the construct validity in the practical use of the individual questionnaires.