924 results for Geophysical instruments


Relevance:

20.00%

Publisher:

Abstract:

In this paper we investigate the influence of extractives, lignin and holocellulose contents on the performance index (PI) of seven woods used or tested for violin bows. Woods with higher values of this index (PI = √MOE/ρ, where MOE is the modulus of elasticity and ρ is the density) have a higher bending stiffness for a given mass, which can be related to bow wood quality. Extractive content was negatively correlated with PI in Caesalpinia echinata, Handroanthus sp. and Astronium lecointei. In C. echinata, holocellulose content was positively correlated with PI. These results need to be explored further with more samples and by testing additional wood properties. Although the chemical constituents can provide an indication of quality, it is not possible to identify suitable bow woods solely by examining their chemical composition.
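To make the performance index concrete, the minimal Python sketch below evaluates PI = √MOE/ρ for a hypothetical sample; the property values are illustrative assumptions, not measurements from the study.

```python
import math

def performance_index(moe_pa: float, density_kg_m3: float) -> float:
    """Bow-wood performance index PI = sqrt(MOE) / rho."""
    return math.sqrt(moe_pa) / density_kg_m3

# Hypothetical values for illustration only (not data from the paper):
# a dense tropical hardwood with MOE = 20 GPa and density = 900 kg/m^3.
pi = performance_index(20e9, 900.0)
print(f"PI = {pi:.1f}  [sqrt(Pa)·m^3/kg]")
```

A stiffer wood at the same density, or a lighter wood at the same stiffness, yields a higher PI, which is why the index serves as a screening criterion for bow woods.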

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to evaluate the efficacy of three rotary instrument systems (K3, ProTaper and Twisted File) in removing calcium hydroxide residues from root canal walls. Thirty-four human mandibular incisors were instrumented with the ProTaper system up to the F2 instrument, irrigated with 2.5% NaOCl followed by 17% EDTA, and filled with a calcium hydroxide intracanal dressing. After 7 days, the calcium hydroxide dressing was removed using the following rotary instruments: G1 - NiTi size 25, 0.06 taper, of the K3 system; G2 - NiTi F2, of the ProTaper system; or G3 - NiTi size 25, 0.06 taper, of the Twisted File system. The teeth were grooved longitudinally on the buccal and lingual root surfaces, split along their long axis, and the apical and cervical thirds of the canals were evaluated by SEM (×1000). The images were scored and the data were analyzed statistically using the Kruskal-Wallis test. None of the instruments removed the calcium hydroxide dressing completely in either the apical or the cervical third, and no significant differences were observed among the rotary instruments tested (p > 0.05).
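Since the comparison above rests on scored SEM images and a Kruskal-Wallis test, the hedged Python sketch below shows how such an analysis is typically run with scipy; the residue scores are invented for illustration and are not the study's data.

```python
from scipy.stats import kruskal

# Hypothetical residue scores (e.g. 1 = nearly clean ... 4 = heavy residue)
# for the three rotary systems; illustrative values only.
k3           = [2, 3, 2, 3, 2, 3, 2, 2]
protaper_f2  = [3, 2, 3, 3, 2, 2, 3, 2]
twisted_file = [2, 2, 3, 2, 3, 2, 2, 3]

h_stat, p_value = kruskal(k3, protaper_f2, twisted_file)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
# p > 0.05 would indicate no significant difference among the systems,
# which is the outcome reported in the abstract above.
```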

Relevance:

20.00%

Publisher:

Abstract:

Objective: This study assessed muscular activity during root canal preparation through kinematics, kinetics, and electromyography (EMG). Material and Methods: The operators prepared one canal with RaCe rotary instruments and another with Flexofiles. The kinematics of the major joints was reconstructed using an optoelectronic system, and the electromyographic responses of the flexor carpi radialis, extensor carpi radialis, brachioradialis, biceps brachii, triceps brachii, middle deltoid, and upper trapezius were recorded. The joint torques of the shoulder, elbow and wrist were calculated using inverse dynamics. In the kinematic analysis, the angular movements of the wrist and elbow were classified as low-risk factors for work-related musculoskeletal disorders; for the shoulder, the classification was medium-risk. Results: The kinetic reports revealed no significant difference. The EMG results showed that rotary instrumentation elicited higher values for the middle deltoid and upper trapezius, whereas the flexor carpi radialis, extensor carpi radialis and brachioradialis showed higher values with the manual method. Conclusion: Muscle recruitment during the joint movements required for root canal preparation differs between the rotary and manual techniques. Nevertheless, rotary instrumentation posed less difficulty in generating the joint torque at each articulation, thus producing greater uniformity of joint torques.
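The study computes joint torques by full inverse dynamics from optoelectronically reconstructed kinematics; as a much-simplified illustration of the idea, the Python sketch below evaluates the net torque for a single rigid segment rotating about a fixed joint in a vertical plane. The equation and all numerical values are generic assumptions, not the study's model or data.

```python
import math

def joint_torque(theta_rad: float, theta_ddot: float, segment_mass: float,
                 com_distance: float, inertia_about_joint: float,
                 g: float = 9.81) -> float:
    """Single-segment planar inverse dynamics:
    tau = I * theta_ddot + m * g * d * cos(theta),
    with theta measured from the horizontal."""
    return (inertia_about_joint * theta_ddot
            + segment_mass * g * com_distance * math.cos(theta_rad))

# Illustrative forearm-plus-hand values (assumed, not measured):
tau = joint_torque(theta_rad=math.radians(30), theta_ddot=2.0,
                   segment_mass=1.5, com_distance=0.18,
                   inertia_about_joint=0.06)
print(f"net joint torque ≈ {tau:.2f} N·m")
```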

Relevance:

20.00%

Publisher:

Abstract:

Introduction: The aim of this study was to assess the effect of nitrogen ion implantation on the flexibility of rotary nickel-titanium (NiTi) instruments, as measured by the load required to bend implanted and non-implanted instruments to a 30° angle. Methods: Thirty K3 files, size #40, 0.02 taper and 25 mm length, were allocated into 2 groups as follows: group A, 15 files exposed to nitrogen ion implantation at a dose of 2.5 × 10¹⁷ ions/cm², energy of 200 keV, current density of 1 μA/cm², temperature of 130 °C, and a vacuum of 10 × 10⁻⁶ mm Hg, for 6 hours; and group B, 15 non-implanted files. One extra file was used for process control. All instruments were subjected to bend testing on a modified troptometer, with measurement of the load required for flexure to an angle of 30°. The Mann-Whitney U test was used for statistical analysis. Findings with P < .05 were considered significant. Results: The mean load required to bend the instruments to a 30° angle was 376.26 g for implanted instruments and 383.78 g for non-implanted instruments. The difference was not statistically significant. Conclusions: Our findings show that nitrogen ion implantation has no appreciable effect on the flexibility of NiTi instruments. (J Endod 2012;38:673-675)
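For readers wanting to reproduce this kind of comparison, the hedged Python sketch below applies a Mann-Whitney U test to two groups of bending loads; the individual values are invented around the reported group means with overlapping spread, and are not the original measurements.

```python
from scipy.stats import mannwhitneyu

# Hypothetical bending loads (grams) at 30 degrees; illustrative only.
implanted     = [362.0, 371.5, 389.3, 374.8, 380.2, 379.8]
non_implanted = [370.4, 391.0, 377.6, 386.2, 381.9, 395.6]

u_stat, p_value = mannwhitneyu(implanted, non_implanted, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
# P >= .05 would be read as no significant effect of nitrogen ion
# implantation on flexibility, matching the study's conclusion.
```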

Relevance:

20.00%

Publisher:

Abstract:

Introduction / objectives: Orthopedic surgeries, especially total hip and knee procedures, have become more frequent owing to technological advances. This study aims to determine the microbial load on instruments used in clean surgeries, quantifying the growth and identifying the genus and species of the micro-organisms recovered. Methods: After use, orthopedic surgical instruments were immersed in sterile water, sonicated in an ultrasonic washer and then shaken. The rinse fluid was filtered through a 0.45 μm membrane, and the membrane was incubated in aerobic and anaerobic media and in media for fungi and yeasts. Results: In clean surgeries, 47% of the instruments used showed microbiological growth in the range of 1 to 100 CFU/instrument. The most prevalent organism was coagulase-negative Staphylococcus (28%), followed by Bacillus subtilis (11%). This study refuted the hypothesis that clean surgeries take place in a microorganism-free surgical field. Conclusion: The microbiological findings reinforce the importance of antibiotic prophylaxis, a practice already well established for this category of surgical procedure.

Relevance:

20.00%

Publisher:

Abstract:

Observing high-energy gamma-rays from Active Galactic Nuclei (AGN) offers a unique potential to probe extremely small values of the intergalactic magnetic field (IGMF), a long-standing question of astrophysics, astroparticle physics and cosmology. Very high energy (VHE) photons from blazars propagating along the line of sight interact with the extragalactic background light (EBL) and produce e⁺e⁻ pairs. Through inverse-Compton scattering, mainly on the cosmic microwave background (CMB), these pairs generate secondary GeV-TeV components accompanying the primary VHE signal. Such secondary components would be detected in the gamma-ray range as delayed “pair echoes” for very weak IGMF (B < 10⁻¹⁶ G), while they should result in spatially extended gamma-ray emission around the source for higher IGMF values (B > 10⁻¹⁶ G). Coordinated observations with space-based (i.e. Fermi) and ground-based gamma-ray instruments, such as the present Cherenkov experiments H.E.S.S., MAGIC and VERITAS, the future Cherenkov Telescope Array (CTA) Observatory, and wide-field detectors such as HAWC and LHAASO, should make it possible to analyze and finally detect such echoes, extended emission or pair halos, and to further characterize the IGMF.
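A quantitative treatment of pair echoes and halos is beyond an abstract, but the scale of the magnetic deflection of the secondary pairs is set by their Larmor radius, R_L ≈ E/(eBc). The Python sketch below evaluates this for assumed, illustrative values of the pair energy and IGMF strength; it is a back-of-the-envelope aid, not a calculation from the paper.

```python
# Relativistic Larmor radius of a secondary e+/e- in the IGMF (SI units):
#   R_L ≈ E / (e * B * c)
# Illustrative, assumed inputs:
E_TeV   = 1.0        # pair energy
B_gauss = 1e-16      # IGMF strength

e   = 1.602e-19      # elementary charge [C]
c   = 2.998e8        # speed of light [m/s]
Mpc = 3.086e22       # metres per megaparsec

E_joule = E_TeV * 1e12 * e     # TeV -> J
B_tesla = B_gauss * 1e-4       # gauss -> tesla

r_larmor = E_joule / (e * B_tesla * c)
print(f"R_L ≈ {r_larmor / Mpc:.0f} Mpc for a {E_TeV} TeV pair in B = {B_gauss:g} G")
```

A larger Larmor radius (weaker field) means smaller deflection and hence time-delayed echoes, whereas a stronger field bends the pairs enough to produce the resolvable extended emission mentioned above.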

Relevance:

20.00%

Publisher:

Abstract:

[EN] The editorial and review processes along the road to publication are described in general terms. The construction of a well-prepared article and the ways in which authors may maximise their chances of success at each stage of the process towards final publication are explored. The most common errors and ways of avoiding them are outlined. Typical problems facing an author writing in English as a second language, including the need for grammatical precision and appropriate style, are discussed. Additionally, the meaning of plagiarism, self-plagiarism and duplicate publication is explored. Critical steps in manuscript preparation and in responding to reviews are examined. Finally, the relation between writing and reviewing is outlined, and it is shown how becoming a good reviewer helps in becoming a successful author.

Relevance:

20.00%

Publisher:

Abstract:

Subduction zones are the most likely places for tsunamigenic earthquakes to be generated, since friction between the oceanic and continental plates produces strong seismicity there. The topics and methodologies discussed in this thesis are focused on understanding the rupture process of the seismic sources of great earthquakes that generate tsunamis. Tsunamigenesis is controlled by several kinematic characteristics of the parent earthquake, such as the focal mechanism, the depth of the rupture and the slip distribution along the fault area, and by the mechanical properties of the source zone. Each of these factors plays a fundamental role in tsunami generation. Therefore, inferring the source parameters of tsunamigenic earthquakes is crucial to understanding the generation of the consequent tsunami and thus to mitigating the risk along the coasts. The typical way to gather information on the source process is to invert the available geophysical data. Tsunami data, moreover, are useful to constrain the portion of the fault area that extends offshore, generally close to the trench, which other kinds of data are unable to constrain. In this thesis I have discussed the rupture process of some recent tsunamigenic events, as inferred by means of an inverse method. I have presented the 2003 Tokachi-Oki (Japan) earthquake (Mw 8.1), for which the slip distribution on the fault was inferred by inverting tsunami waveform, GPS, and bottom-pressure data. The joint inversion of tsunami and geodetic data constrained the slip distribution much better than the separate inversions of the individual datasets. We then studied the earthquake that occurred in 2007 in southern Sumatra (Mw 8.4). By inverting several tsunami waveforms, both in the near and in the far field, we determined the slip distribution and the mean rupture velocity along the causative fault. Since the largest patch of slip was concentrated on the deepest part of the fault, this is the likely reason for the small tsunami waves that followed the earthquake, showing how crucial a role the depth of the rupture plays in controlling tsunamigenesis. Finally, we have presented a new rupture model for the great 2004 Sumatra earthquake (Mw 9.2). We performed a joint inversion of tsunami waveform, GPS and satellite altimetry data to infer the slip distribution, the slip direction, and the rupture velocity on the fault. Furthermore, in this work we have presented a novel method to estimate, in a self-consistent way, the average rigidity of the source zone. Estimating the source zone rigidity is important since it may play a significant role in tsunami generation; in particular, for slow earthquakes, a low rigidity value is sometimes necessary to explain how an earthquake of relatively low seismic moment can generate a significant tsunami. This latter point may be relevant for explaining the mechanics of tsunami earthquakes, one of the open issues in present-day seismology. The investigation of these tsunamigenic earthquakes has underlined the importance of jointly inverting different geophysical data to determine the rupture characteristics.
The results shown here have important implications for the implementation of new tsunami warning systems (particularly in the near field), for the improvement of the current ones, and for the production of inundation maps for tsunami hazard assessment along coastal areas.
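The slip inversions described above combine heterogeneous datasets through precomputed Green's functions. As a schematic, hedged illustration of such a joint inversion (not the thesis's actual code, data or weighting), the Python sketch below stacks two synthetic datasets, weights them by their noise levels, and solves for a non-negative slip distribution by least squares.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Schematic joint inversion d = G @ slip for two synthetic datasets
# (stand-ins for tsunami waveforms and GPS offsets). All values are invented.
n_subfaults = 12
true_slip = np.clip(rng.normal(2.0, 1.5, n_subfaults), 0.0, None)

G_tsunami = rng.normal(size=(60, n_subfaults))   # waveform samples x subfaults
G_gps     = rng.normal(size=(15, n_subfaults))   # GPS components x subfaults
d_tsunami = G_tsunami @ true_slip + rng.normal(0.0, 0.10, 60)
d_gps     = G_gps     @ true_slip + rng.normal(0.0, 0.05, 15)

# Weight each dataset by the inverse of its noise level, stack, and solve
# with a positivity constraint on slip (non-negative least squares).
w_tsu, w_gps = 1.0 / 0.10, 1.0 / 0.05
G = np.vstack([w_tsu * G_tsunami, w_gps * G_gps])
d = np.concatenate([w_tsu * d_tsunami, w_gps * d_gps])
slip_estimate, _ = nnls(G, d)

print("recovered slip (m):", np.round(slip_estimate, 2))
```

In this linearized formulation, the benefit of joint inversion noted in the abstract corresponds to the stacked system being better conditioned than either dataset taken alone.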

Relevance:

20.00%

Publisher:

Abstract:

The theory of the 3D multipole probability tomography method (3D GPT), which images the source poles, dipoles, quadrupoles and octopoles of a geophysical vector or scalar field dataset, is developed. A geophysical dataset is assumed to be the response of an aggregation of poles, dipoles, quadrupoles and octopoles. These elementary sources are used to reconstruct, without a priori assumptions, the most probable position and shape of the true buried geophysical sources, by determining the location of their centres and of the critical points of their boundaries, such as corners, wedges and vertices. The theory is then adapted to the geoelectrical, gravity and self-potential methods. A few synthetic examples using simple geometries and three field examples are discussed in order to demonstrate the notably enhanced resolution power of the new approach. First, an application to a field example from a dipole-dipole geoelectrical survey carried out in the archaeological park of Pompeii is presented. The survey was aimed at recognizing remains of the ancient Roman urban network, including roads, squares and buildings, buried under the thick pyroclastic cover that fell during the 79 AD Vesuvius eruption. The revealed anomaly structures are ascribed to well-preserved remnants of some aligned walls of Roman edifices, buried and partially destroyed by the 79 AD Vesuvius pyroclastic fall. Then, a field example from a gravity survey carried out in the volcanic area of Mount Etna (Sicily, Italy) is presented, aimed at imaging as accurately as possible the differential mass density structure within the first few kilometres of depth inside the volcanic apparatus. An assemblage of vertical prismatic blocks appears to be the most probable gravity model of the Etna apparatus within the first 5 km of depth below sea level. Finally, an experimental SP dataset collected in the Mt. Somma-Vesuvius volcanic district (Naples, Italy) is processed in order to define the location and shape of the sources of two SP anomalies of opposite sign detected in the northwestern sector of the surveyed area. The modelled sources are interpreted as the polarization state induced by an intense hydrothermal convective flow within the volcanic apparatus, from the free surface down to about 3 km depth b.s.l.
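As a heavily simplified, hedged sketch of the idea behind probability tomography (restricted to a single pole source and an assumed 1/r kernel, not the paper's multipole scanner functions), the Python code below scans a grid of trial source positions and computes, at each, a normalized cross-correlation between the observed anomaly and the anomaly of an elementary source placed there; the maximum marks the most probable source location.

```python
import numpy as np

x = np.linspace(-50.0, 50.0, 101)          # profile coordinates [m]

def pole_anomaly(x_obs, xq, zq):
    """Surface anomaly of a unit pole buried at (xq, zq); 1/r kernel assumed."""
    return 1.0 / np.hypot(x_obs - xq, zq)

# Synthetic "observed" profile: a buried pole at x = 10 m, z = 8 m plus noise.
observed = (pole_anomaly(x, 10.0, 8.0)
            + np.random.default_rng(1).normal(0.0, 0.002, x.size))

def occurrence_probability(data, xq, zq):
    """Normalized cross-correlation between data and the trial-pole anomaly."""
    s = pole_anomaly(x, xq, zq)
    return np.sum(data * s) / np.sqrt(np.sum(data ** 2) * np.sum(s ** 2))

grid = [(xq, zq) for xq in np.arange(-40.0, 41.0, 2.0)
                 for zq in np.arange(2.0, 21.0, 1.0)]
eta = [occurrence_probability(observed, xq, zq) for xq, zq in grid]
best_x, best_z = grid[int(np.argmax(eta))]
print(f"maximum occurrence probability at x ≈ {best_x} m, z ≈ {best_z} m")
```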

Relevance:

20.00%

Publisher:

Abstract:

The research is part of a survey for the assessment of the hydraulic and geotechnical conditions of river embankments, funded by the Reno River Basin Regional Technical Service of the Emilia-Romagna Region. The hydraulic safety of the Reno River, one of the main rivers in north-eastern Italy, is indeed of primary importance to the Emilia-Romagna regional administration. The large longitudinal extent of the banks (several hundred kilometres) has generated great interest in non-destructive geophysical methods, which, compared to other methods such as drilling, allow faster and often less expensive acquisition of high-resolution data. The present work aims to evaluate Ground Penetrating Radar (GPR) for the detection of local inhomogeneities (mainly stratigraphic contacts, cavities and conduits) inside the embankments of the Reno River and its tributaries, taking into account supplementary data collected with traditional destructive tests (boreholes, cone penetration tests, etc.). A comparison with other non-destructive methods, namely electrical resistivity tomography (ERT), Multichannel Analysis of Surface Waves (MASW) and FDEM induction, was also carried out in order to verify the usability of GPR and to assess the integration of various geophysical methods into the routine maintenance and inspection of the embankments. The first part of this thesis describes the state of the art concerning the geographic, geomorphologic and geotechnical characteristics of the Reno River and its tributaries' embankments, together with some geophysical applications on embankments of European and North American rivers that served as the bibliographic basis for this thesis. The second part is an overview of the geophysical methods employed in this research (with particular attention to GPR), including their theoretical basis and a closer look at some techniques of geophysical data analysis and representation as applied to river embankments. The subsequent chapters, following the main scope of this research, namely to highlight the advantages and drawbacks of Ground Penetrating Radar applied to the embankments of the Reno River and its tributaries, show the results obtained by analyzing different situations that could lead to the formation of weak zones and, eventually, to embankment failure. Among the advantages, a considerable acquisition speed and a spatial resolution of the acquired data unmatched by the other methods were recorded. As for the drawbacks, attenuation of the propagating waves due to varying clay, silt and sand content, as well as surface effects, significantly limited the correlation between GPR profiles and geotechnical information and therefore hampered the assessment of embankment safety. In summary, Ground Penetrating Radar can be a suitable tool for checking river dike conditions, but its use is significantly limited by the geometric and geotechnical characteristics of the levees of the Reno River and its tributaries. In fact, only the shallower part of the embankment could be investigated, and the information obtained relates only to changes in electrical properties, without any quantitative measurement.
Furthermore, GPR is ineffective for a preliminary assessment of embankment safety conditions, whereas for detailed shallow-depth campaigns, which aim at immediate results with optimal precision, its use is highly recommended. The cases in which a multidisciplinary approach was tested reveal an effective complementarity of the various geophysical methods employed, producing qualitative results in the preliminary phase (FDEM), a quantitative and highly reliable description of the subsoil (ERT) and, finally, fast and highly detailed analyses (GPR). As a recommendation for future research, the combined use of several geophysical techniques to assess the safety conditions of river embankments is strongly suggested, especially when facing a likely flood event, when the entire length of the embankments must be investigated.
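One routine step when interpreting GPR profiles over embankment material is converting two-way travel time to depth through an assumed relative permittivity, with v = c/√εr and depth = v·t/2. The Python sketch below illustrates this with generic, assumed permittivities; it is not taken from the thesis, and real embankments are layered and lossy, which is precisely what limited the method here.

```python
C_VACUUM = 0.2998                   # speed of light in vacuum [m/ns]

def twt_to_depth(twt_ns: float, eps_r: float) -> float:
    """Depth of a reflector from two-way travel time, assuming a homogeneous
    material of relative permittivity eps_r: v = c / sqrt(eps_r), d = v*t/2."""
    v = C_VACUUM / eps_r ** 0.5     # wave velocity in the material [m/ns]
    return v * twt_ns / 2.0

# Assumed permittivities spanning dry sandy to moist silty/clayey material:
for eps_r in (5.0, 9.0, 16.0):
    depth = twt_to_depth(40.0, eps_r)
    print(f"eps_r = {eps_r:4.1f}: 40 ns TWT -> reflector depth ≈ {depth:.2f} m")
```

The strong dependence on εr, which rises with clay and moisture content, is one reason why ground truth from boreholes and CPTs, or complementary methods such as ERT, is needed alongside GPR.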

Relevance:

20.00%

Publisher:

Abstract:

Surgical instruments are essential devices used to support patient care in hospitals. They follow a complete life cycle that conventionally begins in the Store, where sterilized instruments are picked for use in the operating theatres, and ends again in the Store, where the instruments are put back into stock to be reused in a new cycle. Individual phases of the cycle may be delayed with respect to the scheduled times, so that the correct number of instruments is not available in the operating theatres when planned. The project described here aims to optimize the cycle of surgical instruments within a new hospital by applying the principles of the Lean philosophy, and in particular the methods Poka-Yoke, 5S and traceability. To reach this goal, the project was organized as follows. First, the entire life cycle of the instruments was observed in the two main hospitals of Copenhagen (Herlev and Gentofte hospitals). This made it possible to identify the steps of the cycle and to observe in the field its main problems: low flexibility, the decentralization of the cleaning and store departments with respect to the operating theatres, and difficulties in lifting heavy instrument sets. Once the necessary information had been collected, the experimental phase began, in which two different life cycles were mapped using three analysis tools: IDEF0, which provides a hierarchical view of the cycle; Value Stream Mapping, which highlights the main sources of waste in the cycle; and the Tecnomatix simulator, which offers a dynamic view of the analysis. The first cycle was mapped with the sole purpose of highlighting the steps of the cycle and some of the problems encountered in the hospitals visited. The second cycle, instead, was designed from a Lean perspective in order to solve some of the main problems found in the two hospitals and to optimize the first cycle. The main innovations introduced in this second cycle were: the use of barcodes and RFID tags to identify and track the position of the items, the use of an automated storage and retrieval system to minimize the times for storing and picking the items, and finally the use of three types of trolley to allow a flexible care service. Poka-Yoke solutions were also proposed to solve some manual handling problems in the hospitals. To highlight the advantage of the second instrument cycle, the lead time was taken as the comparison parameter and the two simulations previously created were compared. This comparison showed a radical reduction in times (and in the associated costs) for the new solution with respect to the first. The English-language discussion of the research topics follows. Enjoy reading.

Relevance:

20.00%

Publisher:

Abstract:

Basic concepts and definitions relating to Lagrangian Particle Dispersion Models (LPDMs) for the description of turbulent dispersion are introduced. The study focuses on LPDMs that use, as input for the large-scale motion, fields produced by Eulerian models, with the small-scale motions described by Lagrangian Stochastic Models (LSMs). Data from two different dynamical models have been used: a Large Eddy Simulation (LES) and a General Circulation Model (GCM). After reviewing the small-scale closures adopted by the Eulerian models, the development and implementation of appropriate LSMs is outlined. The basic requirement of every LPDM used in this work is that it fulfils the Well Mixed Condition (WMC). For the description of dispersion in the GCM domain, a stochastic model of Markov order 0, consistent with the eddy-viscosity closure of the dynamical model, is implemented. An LSM of Markov order 1, more suitable for shorter timescales, has been implemented for the description of the unresolved motion of the LES fields, with different assumptions made about the small-scale correlation time. Tests of the LSM on GCM fields suggest that an interpolation algorithm able to maintain analytical consistency between the diffusion coefficient and its derivative is mandatory if the model is to satisfy the WMC. A dynamical time-step selection scheme based on the shape of the diffusion coefficient is also introduced, and the criteria for selecting the integration step are discussed. Absolute and relative dispersion experiments are performed with various settings of the unresolved motion for the LSM on LES data, and the results are compared with laboratory data. The study shows that the unresolved turbulence parameterization has a negligible influence on the absolute dispersion, while it affects the contributions of relative dispersion and meandering to absolute dispersion, as well as the Lagrangian correlation.
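For the Markov order-0 case mentioned above, the standard random-displacement form that satisfies the Well Mixed Condition (for constant density) adds a drift term equal to the vertical gradient of the diffusion coefficient: dz = (∂K/∂z) dt + √(2K dt) ξ. The Python sketch below demonstrates this with an assumed, illustrative K(z) profile; it is not the closure or code used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
H = 1000.0                                   # boundary-layer depth [m] (assumed)

def K(z):                                    # illustrative eddy diffusivity [m^2/s]
    return 1.0 + 50.0 * (z / H) * (1.0 - z / H) ** 2

def dKdz(z, dz=0.1):                         # centred finite difference
    return (K(z + dz) - K(z - dz)) / (2.0 * dz)

n_particles, dt, n_steps = 5000, 1.0, 3600
z = rng.uniform(0.0, H, n_particles)         # start from a well-mixed state

for _ in range(n_steps):
    # Markov order-0 (random displacement) step: the dK/dz drift keeps the WMC.
    z += dKdz(z) * dt + np.sqrt(2.0 * K(z) * dt) * rng.standard_normal(n_particles)
    z = np.where(z < 0.0, -z, z)             # reflect at the ground
    z = np.where(z > H, 2.0 * H - z, z)      # reflect at the top

# If the WMC holds, an initially uniform distribution stays uniform.
counts, _ = np.histogram(z, bins=10, range=(0.0, H))
print("particles per layer:", counts)
```

Dropping the dKdz drift term makes particles accumulate where K is small, which is exactly the kind of Well Mixed Condition violation the uniformity check above would reveal.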