945 results for Interpreting geophysical logs


Relevance:

20.00%

Publisher:

Abstract:

While most of this Special Issue is devoted to the testis (which is where most drug- and chemically induced toxicity of the male reproductive tract is identified), being able to recognize and understand the potential effects of toxicants on the epididymis is immensely important and an area that is often overlooked. The epididymis is the organ where post-testicular sperm differentiation occurs, through a complex and still not completely understood maturation process that enables sperm to fertilize the oocyte. The epididymis is also where sperm are stored until ejaculation, protected from immunogenic reactions by a blood-epididymis barrier. From a toxicologic perspective, the epididymis is inherently complicated, as its structure and function can be altered both indirectly and directly. In this review we discuss the factors that must be considered when attempting to distinguish between indirect and direct epididymal toxicity and highlight what is currently known about the mechanisms of epididymal toxicants, using the rat as a reference model. We identify two distinguishable signature lesions: one representing androgen deprivation (secondary to Leydig cell toxicity in the testis) and another representing a direct-acting toxicant. Other commonly observed alterations are also shown and discussed. Finally, we point out that many of the key functions of the epididymis can be altered in the absence of a detectable change in tissue structure. Collectively, we hope this will provide pathologists with increased confidence in identifying epididymal toxicity and enable more informed guidance when mechanism of action is considered.

Relevance:

20.00%

Publisher:

Abstract:

The Bernoulli model for the vibration of beams is often used to predict the bending modulus of elasticity from dynamic tests. However, this model ignores rotary inertia and shear. These effects can be added to the solution of the Bernoulli equation by means of the corrections proposed by Goens (1931) or by Timoshenko (1953), but to apply these corrections it is necessary to know the E/G ratio of the material. The objective of this paper is to determine the E/G ratio of wood logs by adjusting the analytical solution of the Timoshenko beam model to dynamic testing data from 20 Eucalyptus citriodora logs. The dynamic testing was performed with the logs in free-free suspension. To find the stiffness properties of the logs, residual minimization was carried out using a genetic algorithm (GA). From the analysis of the results, one can reasonably assume E/G = 20 for wood logs.
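A minimal sketch of the residual-minimization step described above, using SciPy's differential evolution as a stand-in for the genetic algorithm mentioned in the abstract. The `model_frequencies` function is a crude placeholder (an Euler-Bernoulli estimate with an ad hoc shear/rotary-inertia reduction), not the full analytical Timoshenko solution, and the log geometry and measured frequencies are hypothetical.

```python
# Sketch: fit E and G of a free-free log by minimizing the misfit between
# measured flexural frequencies and a model prediction (hypothetical data).
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical log geometry and measured frequencies
L, D, rho = 3.0, 0.25, 950.0                  # length (m), diameter (m), density (kg/m3)
A, I = np.pi * D**2 / 4, np.pi * D**4 / 64    # cross-section area and second moment
f_measured = np.array([65.0, 170.0, 310.0])   # first three flexural modes (Hz, assumed)
lam = np.array([4.730, 7.853, 10.996])        # free-free Euler-Bernoulli eigenvalues

def model_frequencies(E, G):
    """Placeholder model: Euler-Bernoulli frequencies with a crude shear/rotary
    reduction factor; the real study uses the analytical Timoshenko solution."""
    f_eb = (lam**2 / (2 * np.pi * L**2)) * np.sqrt(E * I / (rho * A))
    r2 = I / A                                 # radius of gyration squared
    correction = 1.0 / np.sqrt(1.0 + (lam**2 * r2 / L**2) * (1.0 + E / G))
    return f_eb * correction

def residual(params):
    E, G = params
    return np.sum((model_frequencies(E, G) - f_measured) ** 2)

# Search bounds for E and G (Pa); differential evolution stands in for the GA
result = differential_evolution(residual, bounds=[(5e9, 30e9), (2e8, 3e9)], seed=0)
E_fit, G_fit = result.x
print(f"E = {E_fit/1e9:.1f} GPa, G = {G_fit/1e9:.2f} GPa, E/G = {E_fit/G_fit:.1f}")
```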

Relevance:

20.00%

Publisher:

Abstract:

Forward modeling is commonly applied to gravity field data of impact structures to determine the main sources of the gravity anomaly. In this context, we have developed 2.5-D gravity models of the Serra da Cangalha impact structure for the purpose of investigating geological bodies and structures underneath the crater. Interpretation of the models was supported by ground magnetic data acquired along profiles, as well as by high-resolution aeromagnetic data. Ground magnetic data reveal the presence of short-wavelength anomalies probably related to shallow magnetic sources that could have been emplaced during the cratering process. Aeromagnetic data show that the basement underneath the crater occurs at an average depth of about 1.9 km, whereas in the region beneath the central uplift it is raised to 0.51 km below the current surface. These depths are also supported by 2.5-D gravity models showing a gentle relief for the basement beneath the central uplift area. The geophysical data were used to provide further constraints for numerical modeling of crater formation, which provided important information on the structural modifications that affected the rocks underneath the crater, as well as on shock-induced changes in the target rocks. The results show that the modelled morphology is consistent with current observations of the crater and that Serra da Cangalha was formed by a meteorite of approximately 1.4 km diameter striking at 12 km s⁻¹.
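For readers unfamiliar with gravity forward modeling, the sketch below computes the vertical gravity anomaly of a single buried sphere along a surface profile. This point-source toy case only illustrates the principle; the study itself uses 2.5-D models of realistic bodies, and every parameter value here is assumed.

```python
# Sketch: vertical gravity anomaly over a buried sphere (point-mass equivalent).
import numpy as np

G_CONST = 6.674e-11                     # gravitational constant (m3 kg-1 s-2)

def sphere_gz(x, depth, radius, drho):
    """Vertical gravity anomaly (mGal) of a sphere with density contrast drho
    (kg/m3) buried at 'depth' (m), along a surface profile x (m)."""
    mass = (4.0 / 3.0) * np.pi * radius**3 * drho
    gz = G_CONST * mass * depth / (x**2 + depth**2) ** 1.5   # m/s2
    return gz * 1e5                                          # convert to mGal

# Hypothetical uplifted dense block approximated by a sphere
x = np.linspace(-5000, 5000, 201)       # profile coordinates (m)
anomaly = sphere_gz(x, depth=1500.0, radius=800.0, drho=200.0)
print(f"peak anomaly: {anomaly.max():.2f} mGal at x = {x[np.argmax(anomaly)]:.0f} m")
```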

Relevance:

20.00%

Publisher:

Abstract:

Aim: To use published literature and experts' opinion to investigate the clinical meaning and magnitude of changes in the Quality of Life (QOL) of groups of patients measured with the European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire Core 30 (EORTC QLQ-C30). Methods: An innovative method combining systematic review of published studies, expert opinion and meta-analysis was used to estimate large, medium, and small mean changes over time for QLQ-C30 scores. Results: Nine hundred and eleven papers were identified, of which 118 were relevant. One thousand two hundred and thirty-two mean changes in QOL over time were combined in the meta-analysis, with timescales ranging from four days to five years. Guidelines were produced for trivial, small, and medium size classes, for each subscale and for improving and declining scores separately. Estimates for improvements were smaller than the respective estimates for declines. Conclusions: These guidelines can be used to aid sample size calculations and the interpretation of mean changes over time in groups of patients. Observed mean changes in QLQ-C30 scores are generally small in most clinical situations, possibly due to response shift. Careful consideration is needed when planning studies where QOL changes over time are of primary interest; the timing of follow-up, sample attrition, the direction of QOL changes, and the subscales of primary interest are key considerations. (C) 2012 Elsevier Ltd. All rights reserved.
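As an illustration of how such guideline values might feed into a sample-size calculation, the sketch below uses the standard normal-approximation formula for detecting a mean change in a paired design. The effect size, standard deviation, and power settings are hypothetical and are not values taken from the paper.

```python
# Sketch: sample size needed to detect a mean QOL change (paired design).
from scipy.stats import norm

def sample_size_paired(delta, sd, alpha=0.05, power=0.80):
    """n for detecting a mean change 'delta' given the SD of the changes,
    using n = ((z_{1-a/2} + z_{1-b}) * sd / delta)^2."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ((z_a + z_b) * sd / delta) ** 2

# Hypothetical values: a "small" mean change of 5 points with an SD of 20 points
print(round(sample_size_paired(delta=5.0, sd=20.0)))   # -> about 126 patients
```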

Relevance:

20.00%

Publisher:

Abstract:

Intense phytoplankton blooms have been observed along the Patagonian shelf-break with satellite ocean color data, but few in situ optical observations have been made in that region. We examine the variability of phytoplankton absorption and particulate scattering coefficients during such blooms on the basis of field data. The chlorophyll-a concentration, [Chla], ranged from 0.1 to 22.3 mg m−3 in surface waters. Size fractionation of [Chla] showed that 80% of the samples were dominated by nanophytoplankton (N-group) and 20% by microphytoplankton (M-group). The chlorophyll-specific phytoplankton absorption coefficients at 440 and 676 nm, a*ph(440) and a*ph(676), and the chlorophyll-specific particulate scattering coefficient at 660 nm, b*p(660), ranged from 0.018 to 0.173, 0.009 to 0.046, and 0.031 to 2.37 m2 (mg Chla)−1, respectively. Both a*ph(440) and a*ph(676) were statistically higher for the N-group than for the M-group, and also considerably higher than expected from global trends as a function of [Chla]. This result suggests that phytoplankton cells in Patagonian waters tend to be smaller than in other regions at similar [Chla]. The phytoplankton cell size parameter, Sf, derived from the phytoplankton absorption spectra, proved useful for interpreting the variability in the data around the general inverse dependence of a*ph(440), a*ph(676), and b*p(660) on [Chla]. Sf also varied consistently along the increasing trend of a*ph(440) and a*ph(676) as a function of the ratios of certain accessory pigments to [Chla]. Our results suggest that the variability in phytoplankton absorption and scattering coefficients in Patagonian waters is caused primarily by changes in the dominant phytoplankton cell size, accompanied by covariation in the concentrations of accessory pigments.
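The chlorophyll-specific coefficients discussed above are simply the measured coefficients normalized by [Chla], and their dependence on [Chla] is commonly summarized by a power law of the form a*ph(440) = A [Chla]^B. The sketch below, using entirely hypothetical sample values, shows the normalization and a log-log least-squares fit of that power law.

```python
# Sketch: chlorophyll-specific absorption and a power-law fit a*ph(440) = A*[Chla]^B.
import numpy as np

# Hypothetical samples: [Chla] (mg m-3) and phytoplankton absorption at 440 nm (m-1)
chla = np.array([0.2, 0.8, 2.5, 6.0, 15.0, 22.0])
a_ph_440 = np.array([0.015, 0.045, 0.11, 0.21, 0.42, 0.55])

a_star_440 = a_ph_440 / chla            # chlorophyll-specific absorption, m2 (mg Chla)-1

# Fit log10(a*ph) = log10(A) + B*log10([Chla]) by ordinary least squares
B, logA = np.polyfit(np.log10(chla), np.log10(a_star_440), 1)
A = 10 ** logA
print(f"a*ph(440) ~ {A:.3f} [Chla]^{B:.2f}  m2 (mg Chla)-1")
```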

Relevance:

20.00%

Publisher:

Abstract:

The editorial and review processes along the road to publication are described in general terms. The construction of a well-prepared article and the manner in which authors may maximise the chances of success at each stage of the process towards final publication are explored. The most common errors and ways of avoiding them are outlined. Typical problems facing an author writing in English as a second language, including the need for grammatical precision and appropriate style, are discussed. Additionally, the meaning of plagiarism, self-plagiarism and duplicate publication is explored. Critical steps in manuscript preparation and response to reviews are examined. Finally, the relation between writing and reviewing is outlined, and it is shown how becoming a good reviewer helps in becoming a successful author.

Relevance:

20.00%

Publisher:

Abstract:

Every seismic event produces seismic waves which travel throughout the Earth. Seismology is the science of interpreting measurements of these waves to derive information about the structure of the Earth. Seismic tomography is the most powerful tool for determining the 3D structure of the Earth's deep interior. Tomographic models obtained at global and regional scales are a fundamental tool for determining the geodynamical state of the Earth, showing evident correlations with other geophysical and geological characteristics. Global tomographic images of the Earth can be written as linear combinations of basis functions from a specifically chosen set, defining the model parameterization. A number of different parameterizations are commonly seen in the literature: seismic velocities in the Earth have been expressed, for example, as combinations of spherical harmonics or by means of the simpler characteristic functions of discrete cells. In this work we focus on this aspect, evaluating a new type of parameterization based on wavelet functions. It is known from classical Fourier theory that a signal can be expressed as the sum of a, possibly infinite, series of sines and cosines. This sum is often referred to as a Fourier expansion. The big disadvantage of a Fourier expansion is that it has only frequency resolution and no time resolution. Wavelet analysis (or the wavelet transform) is probably the most recent solution for overcoming the shortcomings of Fourier analysis. The fundamental idea behind this analysis is to study a signal according to scale. Wavelets, in fact, are mathematical functions that cut up data into different frequency components and then study each component with a resolution matched to its scale, so they are especially useful in the analysis of non-stationary processes that contain multi-scale features, discontinuities and sharp spikes. Wavelets are essentially used in two ways when applied to the study of geophysical processes or signals: 1) as a basis for the representation or characterization of a process; 2) as an integration kernel for analysis, to extract information about the process. These two types of application of wavelets to geophysical problems are the object of study of this work. First, we use wavelets as a basis to represent and solve the tomographic inverse problem. After a brief introduction to seismic tomography theory, we assess the power of wavelet analysis in the representation of two different types of synthetic models; we then apply it to real data, obtaining surface-wave phase-velocity maps and evaluating its performance by comparison with another type of parameterization (i.e., block parameterization). For the second type of wavelet application, we analyze the ability of the continuous wavelet transform to perform spectral analysis, starting again with synthetic tests to evaluate its sensitivity and capability, and then applying the same analysis to real data to obtain local correlation maps between different models at the same depth or between different profiles of the same model.
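As a concrete illustration of using wavelets as a representation basis, the sketch below decomposes a 1-D synthetic signal into wavelet coefficients with the PyWavelets library, keeps only the largest coefficients, and reconstructs the signal. The choice of wavelet family and the synthetic signal are arbitrary and only illustrate the multi-scale representation idea, not the tomographic parameterization itself.

```python
# Sketch: represent a signal in a wavelet basis, keep the largest coefficients,
# and reconstruct (illustrates wavelet parameterization of a field/profile).
import numpy as np
import pywt

# Synthetic "profile" with a smooth trend plus a localized sharp feature
x = np.linspace(0, 1, 512)
signal = np.sin(2 * np.pi * 3 * x) + (np.abs(x - 0.6) < 0.02) * 1.5

# Multi-level discrete wavelet decomposition (Daubechies-4, arbitrary choice)
coeffs = pywt.wavedec(signal, "db4", level=5)

# Keep only the largest 10% of coefficients (a sparse representation)
flat, slices = pywt.coeffs_to_array(coeffs)
threshold = np.quantile(np.abs(flat), 0.90)
flat[np.abs(flat) < threshold] = 0.0
sparse_coeffs = pywt.array_to_coeffs(flat, slices, output_format="wavedec")

reconstructed = pywt.waverec(sparse_coeffs, "db4")[: signal.size]
print(f"relative L2 error with 10% of coefficients: "
      f"{np.linalg.norm(reconstructed - signal) / np.linalg.norm(signal):.3f}")
```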

Relevance:

20.00%

Publisher:

Abstract:

Subduction zones, where friction between the oceanic and continental plates generates strong seismicity, are the most common settings for tsunamigenic earthquakes. The topics and methodologies discussed in this thesis are focused on understanding the rupture process of the seismic sources of great tsunami-generating earthquakes. Tsunamigenesis is controlled by several kinematic characteristics of the parent earthquake, such as the focal mechanism, the depth of the rupture and the slip distribution over the fault area, as well as by the mechanical properties of the source zone. Each of these factors plays a fundamental role in tsunami generation. Therefore, inferring the source parameters of tsunamigenic earthquakes is crucial to understanding the generation of the consequent tsunami and thus to mitigating the risk along the coasts. The typical way to gather information about the source process is to invert the available geophysical data. Tsunami data, moreover, are useful to constrain the portion of the fault that extends offshore, generally close to the trench, which other kinds of data are unable to constrain. In this thesis I discuss the rupture process of some recent tsunamigenic events, as inferred by means of an inverse method. I first present the 2003 Tokachi-Oki (Japan) earthquake (Mw 8.1). In this study the slip distribution on the fault was inferred by inverting tsunami waveform, GPS, and bottom-pressure data. The joint inversion of tsunami and geodetic data provided a much better constraint on the slip distribution than the separate inversions of the single datasets. We then studied the earthquake that occurred in 2007 in southern Sumatra (Mw 8.4). By inverting several tsunami waveforms, both in the near and in the far field, we determined the slip distribution and the mean rupture velocity along the causative fault. Since the largest patch of slip was concentrated in the deepest part of the fault, this is the likely reason for the small tsunami waves that followed the earthquake, underlining the crucial role played by the depth of the rupture in controlling tsunamigenesis. Finally, we present a new rupture model for the great 2004 Sumatra earthquake (Mw 9.2). We performed the joint inversion of tsunami waveform, GPS and satellite altimetry data to infer the slip distribution, the slip direction, and the rupture velocity on the fault. Furthermore, in this work we present a novel method to estimate, in a self-consistent way, the average rigidity of the source zone. The estimation of the source-zone rigidity is important since it may play a significant role in tsunami generation and, particularly for slow earthquakes, a low rigidity value is sometimes necessary to explain how an earthquake with a relatively low seismic moment may generate a significant tsunami; this latter point may be relevant for explaining the mechanics of tsunami earthquakes, one of the open issues in present-day seismology. The investigation of these tsunamigenic earthquakes has underlined the importance of jointly inverting different geophysical datasets to determine the rupture characteristics.
The results shown here have important implications for the implementation of new tsunami warning systems (particularly in the near field), for the improvement of current systems, and for the planning of inundation maps for tsunami-hazard assessment along coastal areas.
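A highly simplified sketch of the kind of linear inversion underlying these studies: the observed data (tsunami waveforms, GPS offsets, etc.) are modelled as d = G m, where the columns of G are precomputed Green's functions for unit slip on each subfault, and the slip vector m is recovered under a non-negativity constraint. The Green's functions and data below are random placeholders; the actual studies compute G with hydrodynamic and elastic-dislocation modelling and typically add smoothing and joint-dataset weighting.

```python
# Sketch: recover a non-negative slip distribution m from data d = G m + noise.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

n_data, n_subfaults = 200, 20
G = rng.normal(size=(n_data, n_subfaults))        # placeholder Green's functions
m_true = np.zeros(n_subfaults)
m_true[8:13] = [2.0, 5.0, 8.0, 5.0, 2.0]          # a single slip patch (m)

d = G @ m_true + 0.1 * rng.normal(size=n_data)    # synthetic "observed" data

# Non-negative least squares: slip cannot be negative on the fault
m_est, res = nnls(G, d)
print("estimated slip on subfaults 8-12:", np.round(m_est[8:13], 2))
```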

Relevance:

20.00%

Publisher:

Abstract:

The theory of the 3D multipole probability tomography method (3D GPT) is developed to image the source poles, dipoles, quadrupoles and octopoles of a geophysical vector or scalar field dataset. A geophysical dataset is assumed to be the response of an aggregation of poles, dipoles, quadrupoles and octopoles. These physical sources are used to reconstruct, without a priori assumptions, the most probable position and shape of the true buried geophysical sources, by determining the location of their centres and the critical points of their boundaries, such as corners, wedges and vertices. This theory is then adapted to the geoelectrical, gravity and self-potential methods. A few synthetic examples using simple geometries and three field examples are discussed in order to demonstrate the notably enhanced resolution power of the new approach. First, the application to a field example related to a dipole–dipole geoelectrical survey carried out in the archaeological park of Pompei is presented. The survey was aimed at recognizing remains of the ancient Roman urban network, including roads, squares and buildings, which were buried under the thick pyroclastic cover deposited during the 79 AD Vesuvius eruption. The revealed anomaly structures are ascribed to well-preserved remnants of aligned walls of Roman edifices, buried and partially destroyed by the 79 AD Vesuvius pyroclastic fall. Then, a field example related to a gravity survey carried out in the volcanic area of Mount Etna (Sicily, Italy) is presented, aimed at imaging as accurately as possible the differential mass density structure within the first few kilometres of depth inside the volcanic apparatus. An assemblage of vertical prismatic blocks appears to be the most probable gravity model of the Etna apparatus within the first 5 km of depth below sea level. Finally, an experimental SP dataset collected in the Mt. Somma-Vesuvius volcanic district (Naples, Italy) is analysed in order to define the location and shape of the sources of two SP anomalies of opposite sign detected in the northwestern sector of the surveyed area. The modelled sources are interpreted as the polarization state induced by an intense hydrothermal convective flow mechanism within the volcanic apparatus, from the free surface down to a depth of about 3 km b.s.l.
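A schematic illustration of the scanning idea behind probability tomography: an elementary source (here a single pole) is moved through a subsurface grid, and at each trial position a normalized cross-correlation between the observed surface anomaly and the theoretical anomaly of that elementary source is taken as an occurrence probability. This is a simplified version for a self-potential-like scalar field, with invented geometry and source strength; it should not be read as the authors' exact multipole formulation.

```python
# Sketch: pole occurrence probability by scanning an elementary source through
# a subsurface grid and cross-correlating its anomaly with the observed field.
import numpy as np

# Observation points along a surface profile (m) and a synthetic "observed"
# anomaly generated by a hidden pole at x = 40 m, z = 15 m (assumed)
x_obs = np.linspace(0.0, 100.0, 101)

def pole_field(x, xq, zq, strength=1.0):
    return strength / np.sqrt((x - xq) ** 2 + zq ** 2)

v_obs = pole_field(x_obs, 40.0, 15.0, strength=-120.0)

# Scan trial pole positions over a subsurface grid
xs = np.linspace(0.0, 100.0, 51)
zs = np.linspace(2.0, 40.0, 39)
eta = np.zeros((zs.size, xs.size))
for i, zq in enumerate(zs):
    for j, xq in enumerate(xs):
        k = pole_field(x_obs, xq, zq)            # theoretical anomaly of trial pole
        eta[i, j] = np.dot(v_obs, k) / (np.linalg.norm(v_obs) * np.linalg.norm(k))

i_max, j_max = np.unravel_index(np.argmax(np.abs(eta)), eta.shape)
print(f"most probable pole near x = {xs[j_max]:.0f} m, z = {zs[i_max]:.0f} m, "
      f"eta = {eta[i_max, j_max]:+.2f}")
```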

Relevance:

20.00%

Publisher:

Abstract:

The research is part of a survey for the assessment of the hydraulic and geotechnical conditions of river embankments, funded by the Reno River Basin Regional Technical Service of the Emilia-Romagna Region. The hydraulic safety of the Reno River, one of the main rivers in north-eastern Italy, is indeed of primary importance to the Emilia-Romagna regional administration. The large longitudinal extent of the banks (several hundreds of kilometres) has generated great interest in non-destructive geophysical methods, which, compared to other methods such as drilling, allow for the faster and often less expensive acquisition of high-resolution data. The present work aims to evaluate Ground Penetrating Radar (GPR) for the detection of local non-homogeneities (mainly stratigraphic contacts, cavities and conduits) inside the embankments of the Reno River and its tributaries, taking into account supplementary data collected with traditional destructive tests (boreholes, cone penetration tests, etc.). A comparison with other non-destructive methodologies, such as electrical resistivity tomography (ERT), Multichannel Analysis of Surface Waves (MASW) and FDEM induction, was also carried out in order to verify the usability of GPR and to support the integration of various geophysical methods in the regular maintenance and checking of embankment conditions. The first part of this thesis is devoted to the state of the art concerning the geographic, geomorphologic and geotechnical characteristics of the embankments of the Reno River and its tributaries, as well as to the description of some geophysical applications on embankments of European and North American rivers, which served as the bibliographic basis for this work. The second part is an overview of the geophysical methods employed in this research (with particular attention to GPR), reporting their theoretical basis together with a more detailed treatment of some techniques for the analysis and representation of geophysical data when applied to river embankments. The subsequent chapters, following the main scope of this research, which is to highlight the advantages and drawbacks of Ground Penetrating Radar applied to the embankments of the Reno River and its tributaries, present the results obtained by analyzing different situations that could lead to the formation of weakness zones and, eventually, to embankment failure. Among the advantages, the acquisition speed and the spatial resolution of the data were found to be far superior to those of the other methodologies. Among the drawbacks, attenuation losses of the propagating waves, due to the varying content of clay, silt and sand, as well as surface effects, significantly limited the correlation between GPR profiles and geotechnical information and therefore compromised the embankment safety assessment. In summary, Ground Penetrating Radar can be a suitable tool for checking river dike conditions, but its use is significantly limited by the geometric and geotechnical characteristics of the levees of the Reno River and its tributaries. In fact, only the shallower part of the embankment could be investigated, and the information obtained related only to changes in electrical properties, without any quantitative measurement.
GPR is therefore ineffective for a preliminary assessment of embankment safety conditions, whereas for detailed campaigns at shallow depth, aimed at achieving immediate results with optimal precision, its use is highly recommended. The cases where a multidisciplinary approach was tested reveal an excellent interplay of the various geophysical methodologies employed, producing qualitative results in the preliminary phase (FDEM), ensuring a quantitative and highly reliable description of the subsoil (ERT) and, finally, providing fast and highly detailed analysis (GPR). As a recommendation for future research, the simultaneous use of several geophysical techniques to assess the safety conditions of river embankments is strongly suggested, especially when facing a significant flood event, when the entire extent of the embankments themselves must be investigated.
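For reference, the sketch below shows the elementary conversion from GPR two-way travel time to depth via the relative permittivity of the embankment material. The permittivity values are illustrative only; in practice they vary strongly with the clay, silt, sand and water content mentioned above, which is one source of the limitations discussed.

```python
# Sketch: convert GPR two-way travel time to reflector depth via permittivity.
C0 = 0.2998  # speed of light in vacuum, m/ns

def twt_to_depth(twt_ns, eps_r):
    """Reflector depth (m) from two-way travel time (ns): v = c / sqrt(eps_r),
    depth = v * t / 2."""
    v = C0 / eps_r ** 0.5
    return v * twt_ns / 2.0

# Illustrative permittivities: dry sand ~ 5, wet silty/clayey soil ~ 20
for eps_r in (5.0, 20.0):
    print(f"eps_r = {eps_r:4.1f}: 50 ns TWT -> {twt_to_depth(50.0, eps_r):.2f} m")
```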

Relevance:

20.00%

Publisher:

Abstract:

Hydrothermal fluids are a fundamental resource for understanding and monitoring volcanic and non-volcanic systems. This thesis focuses on the study of hydrothermal systems through numerical modeling with the geothermal simulator TOUGH2. Several simulations are presented, and the geophysical and geochemical observables arising from fluid circulation are analyzed in detail throughout the thesis. In a volcanic setting, the fluids feeding fumaroles and hot springs may play a key role in hazard evaluation. The evolution of fluid circulation is driven by a strong interaction between the magmatic and hydrothermal systems. A simultaneous analysis of different geophysical and geochemical observables is a sound approach for interpreting monitored data and for inferring a consistent conceptual model. The analyzed observables are ground displacement, gravity changes, electrical conductivity, the amount, composition and temperature of the gases emitted at the surface, and the extent of the degassing area. The results highlight the different temporal responses of the considered observables, as well as their different radial patterns of variation. However, the magnitude, temporal response and radial pattern of these signals depend not only on the evolution of fluid circulation; a major role is also played by the assumed rock properties. Numerical simulations highlight the differences that arise from assuming different permeabilities, for both homogeneous and heterogeneous systems. Rock properties affect hydrothermal fluid circulation, controlling both the range of variation and the temporal evolution of the observable signals. Low-temperature fumaroles with low discharge rates may be affected by atmospheric conditions. Detailed parametric simulations were performed, aimed at understanding the effects of system properties, such as permeability and gas reservoir overpressure, on diffuse degassing when air temperature and barometric pressure changes are applied at the ground surface. Hydrothermal circulation, however, is not only a characteristic of volcanic systems. Hot fluids are also involved in several applied problems, such as geothermal engineering, the propagation of nuclear waste in porous media, and Geological Carbon Sequestration (GCS). The current concept for large-scale GCS is the direct injection of supercritical carbon dioxide into deep geological formations that typically contain brine. Upward displacement of such brine from deep reservoirs, driven by the pressure increases resulting from carbon dioxide injection, may occur through abandoned wells, permeable faults or permeable channels. Brine intrusion into aquifers may degrade groundwater resources. Numerical results show that the pressure rise drives dense water up the conduits, but does not necessarily result in continuous flow. Rather, the overpressure leads to a new hydrostatic equilibrium if the fluids are initially density-stratified. If the warm, salty fluid does not cool while passing through the conduit, an oscillatory solution is possible. Parameter studies delineate steady-state (static) and oscillatory solutions.
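A back-of-the-envelope version of the brine-displacement balance discussed above: for an initially density-stratified system, upward flow through a conduit starts only when the injection-driven overpressure exceeds the extra hydrostatic weight of the denser brine column. The densities and column height below are assumed values, not results from the simulations.

```python
# Sketch: overpressure needed to push denser brine up a conduit of height h
# into an overlying fresher aquifer (static balance, no friction).
g = 9.81                     # m/s2

def required_overpressure(rho_brine, rho_fresh, h):
    """Minimum overpressure (Pa) at the conduit base for upward brine displacement:
    dP > (rho_brine - rho_fresh) * g * h."""
    return (rho_brine - rho_fresh) * g * h

# Hypothetical values: brine 1100 kg/m3, fresh water 1000 kg/m3, 500 m conduit
dp = required_overpressure(1100.0, 1000.0, 500.0)
print(f"required overpressure ~ {dp/1e5:.1f} bar")   # ~ 4.9 bar
```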

Relevance:

20.00%

Publisher:

Abstract:

The goal of the present research is to define a Semantic Web framework for precedent modelling, using knowledge extracted from text, metadata, and rules, while maintaining a strong text-to-knowledge morphism between legal text and legal concepts, in order to fill the gap between a legal document and its semantics. The framework is composed of four different models that make use of standard languages from the Semantic Web stack of technologies: a document metadata structure, modelling the main parts of a judgement and creating a bridge between a text and its semantic annotations of legal concepts; a legal core ontology, modelling abstract legal concepts and institutions contained in a rule of law; a legal domain ontology, modelling the main legal concepts in a specific domain concerned by case-law; and an argumentation system, modelling the structure of argumentation. The input to the framework includes metadata associated with judicial concepts and an ontology library representing the structure of case-law. The research relies on the previous efforts of the community in the field of legal knowledge representation and rule interchange for applications in the legal domain, in order to apply the theory to a set of real legal documents, stressing OWL axiom definitions as much as possible so that they provide a semantically powerful representation of the legal document and a solid ground for an argumentation system using a defeasible subset of predicate logic. It appears that some new features of OWL2 unlock useful reasoning capabilities for legal knowledge, especially if combined with defeasible rules and argumentation schemes. The main task is thus to formalize the legal concepts and argumentation patterns contained in a judgement, with the following requirement: to check, validate and reuse the discourse of a judge, and the argumentation he produces, as expressed by the judicial text.
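A minimal sketch, using the rdflib Python library, of the kind of text-to-concept bridge the framework describes: a fragment of a judgement is annotated with a legal concept drawn from a domain ontology. The namespaces, class names, and document URI below are invented for illustration and do not correspond to the actual ontologies developed in the research.

```python
# Sketch: link a judgement text fragment to a legal domain concept with RDF/OWL.
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import OWL

# Hypothetical namespaces (not the actual ontologies of the framework)
DOC = Namespace("http://example.org/judgement/2024-001#")
LEG = Namespace("http://example.org/legal-domain-ontology#")

g = Graph()
g.bind("doc", DOC)
g.bind("leg", LEG)

# Domain ontology fragment: a class for a legal concept used in case-law
g.add((LEG.ConsumerContract, RDF.type, OWL.Class))
g.add((LEG.ConsumerContract, RDFS.label, Literal("Consumer contract", lang="en")))

# Document metadata: a paragraph of the judgement annotated with that concept
g.add((DOC.paragraph12, RDF.type, DOC.JudgementParagraph))
g.add((DOC.paragraph12, LEG.mentionsConcept, LEG.ConsumerContract))
g.add((DOC.paragraph12, RDFS.comment,
       Literal("Text span of paragraph 12 of the judgement (placeholder).")))

print(g.serialize(format="turtle"))
```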

Relevance:

20.00%

Publisher:

Abstract:

Basic concepts and definitions relating to Lagrangian Particle Dispersion Models (LPDMs) for the description of turbulent dispersion are introduced. The study focuses on LPDMs that use, as input for the large-scale motion, fields produced by Eulerian models, with the small-scale motions described by Lagrangian Stochastic Models (LSMs). Data from two different dynamical models have been used: a Large Eddy Simulation (LES) and a General Circulation Model (GCM). After reviewing the small-scale closure adopted by the Eulerian model, the development and implementation of appropriate LSMs is outlined. The basic requirement of every LPDM used in this work is that it fulfil the Well Mixed Condition (WMC). For the description of dispersion in the GCM domain, a stochastic model of Markov order 0, consistent with the eddy-viscosity closure of the dynamical model, is implemented. An LSM of Markov order 1, more suitable for shorter timescales, has been implemented for the description of the unresolved motion of the LES fields. Different assumptions on the small-scale correlation time are made. Tests of the LSM on GCM fields suggest that an interpolation algorithm able to maintain analytical consistency between the diffusion coefficient and its derivative is mandatory if the model is to satisfy the WMC. A dynamical time-step selection scheme based on the shape of the diffusion coefficient is also introduced, and the criteria for selecting the integration step are discussed. Absolute and relative dispersion experiments are performed with various unresolved-motion settings for the LSM on LES data, and the results are compared with laboratory data. The study shows that the unresolved-turbulence parameterization has a negligible influence on absolute dispersion, while it affects the contributions of relative dispersion and meandering to absolute dispersion, as well as the Lagrangian correlation.
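A minimal sketch of a Markov order-0 (random displacement) model of the kind mentioned above, for an inhomogeneous eddy diffusivity K(z): to satisfy the Well Mixed Condition the drift term must include the diffusivity gradient, dz = K'(z) dt + sqrt(2 K(z) dt) ξ. The diffusivity profile, boundary treatment, and numerical settings below are illustrative choices, not those used with the GCM fields.

```python
# Sketch: well-mixed random displacement (Markov-0) model for particles in a
# boundary layer with height-dependent diffusivity K(z).
import numpy as np

rng = np.random.default_rng(1)
H = 1000.0                                   # boundary-layer depth (m)

def K(z):                                    # illustrative parabolic diffusivity profile
    return 1.0 + 50.0 * (z / H) * (1.0 - z / H)

def dKdz(z):                                 # analytical derivative, consistent with K
    return 50.0 * (1.0 - 2.0 * z / H) / H

n_particles, dt, n_steps = 20000, 1.0, 1800
z = rng.uniform(0.0, H, n_particles)         # initially well mixed

for _ in range(n_steps):
    xi = rng.standard_normal(n_particles)
    # Drift term K'(z) dt keeps an initially uniform distribution uniform (WMC)
    z = z + dKdz(z) * dt + np.sqrt(2.0 * K(z) * dt) * xi
    z = np.abs(z)                            # reflect at the ground
    z = np.where(z > H, 2.0 * H - z, z)      # reflect at the boundary-layer top

# If the WMC holds, the particle density stays close to uniform
hist, _ = np.histogram(z, bins=10, range=(0.0, H))
print("relative occupancy per layer:", np.round(hist / hist.mean(), 2))
```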

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis is to argue for the usefulness of shadowing in interpreter training, drawing on Alvin Liberman's Motor Theory of speech perception and operating within the broader theoretical framework of embodied cognition, which includes theories of language development and second language acquisition. In interpreter training, shadowing is an exercise consisting of the immediate repetition of what is heard through headphones, word for word and in the same language as the source text; it is generally used as a preparatory exercise for simultaneous interpreting, since it allows trainees both to "learn" to listen and speak at the same time and to improve their pronunciation and fluency in the foreign language. However, within Interpreting Studies, some scholars consider it a useless and, in some respects, harmful exercise, since it would place the emphasis on an excessively "mechanical" view of the interpreting process. To argue for its usefulness in interpreter training, this thesis, after presenting the main theories of language development and second language acquisition, reviews the results of research conducted not only within Interpreting Studies but also in the broader perspective of foreign/second language teaching and, above all, in neurolinguistics and cognitive psychology, where shadowing is used to analyze the cognitive processes underlying language reception and production (motor articulation, working memory, selective attention, etc.). The last chapter of this work is devoted to the description of a very recent approach to speech perception and production, which combines Liberman's Motor Theory of speech perception (1967) with the recent discovery of mirror neurons and sheds new light on the usefulness of shadowing in interpreter training.