919 results for Fault locator
Abstract:
This study provided information about how individual workers perceive, describe and interpret episodes of problematic communication. Sixteen full-time workers (5 males, 11 females) were interviewed in depth about specific incidents of problematic communication within their workplace. Their descriptions of the attributed causes of the incidents were coded using a categorisation scheme developed from Coupland, Wiemann, and Giles' (1991) model of sources of problematic communication. Communication problems were most commonly attributed to individual deficiency and group membership, although there were differences depending on the direction of communication. The most negative attributions (to personality flaws, to lack of skills, and to negative stereotypes of the outgroup) were most commonly applied by individuals to their supervisors, whilst attributions applied to co-workers and subordinates tended to be less negative, or even positive in some instances (where individuals attributed the fault to themselves). Overall, the results highlighted distinctions between perceptions of communication problems with supervisors and with subordinates, and are interpreted with reference to social identity theory.
Abstract:
In order to investigate the effect of material anisotropy on the convective instability of three-dimensional fluid-saturated faults, an exact analytical solution for the critical Rayleigh number of three-dimensional convective flow has been obtained. Using this critical Rayleigh number, the effects of different permeability ratios and thermal conductivity ratios on the convective instability of a vertically oriented three-dimensional fault have been examined in detail. It has been recognized that (1) if the fault material is isotropic in the horizontal direction, the horizontal-to-vertical permeability ratio has a significant effect on the critical Rayleigh number of the three-dimensional fault system, but the horizontal-to-vertical thermal conductivity ratio has little influence on the convective instability of the system, and (2) if the fault material is isotropic in the fault plane, the fault-normal to in-plane thermal conductivity ratio has a considerable effect on the critical Rayleigh number of the three-dimensional fault system, but the effect of the fault-normal to in-plane permeability ratio on the critical Rayleigh number of three-dimensional convective flow is negligible.
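As a rough quantitative companion to the abstract above, the sketch below evaluates the standard porous-medium (Horton-Rogers-Lapwood) Rayleigh number for a fluid-saturated fault zone. The function name `rayleigh_number` and all parameter values are illustrative assumptions, not taken from the paper, and the sketch does not reproduce the paper's exact anisotropic solution.

```python
# Minimal sketch: porous-medium Rayleigh number for a fluid-saturated fault.
# The scaling below is the standard Horton-Rogers-Lapwood form; all values
# are assumptions for illustration, not the paper's exact anisotropic result.

def rayleigh_number(k_v, dT, H, rho_f=1000.0, g=9.81, beta=2.1e-4,
                    mu=1.0e-3, lam_m=2.5, c_f=4185.0):
    """Ra = rho_f * g * beta * k_v * H * dT / (mu * kappa_m),
    with kappa_m = lam_m / (rho_f * c_f) the effective thermal diffusivity."""
    kappa_m = lam_m / (rho_f * c_f)          # m^2/s
    return rho_f * g * beta * k_v * H * dT / (mu * kappa_m)

# Example: 5 km tall fault zone, 150 K temperature contrast, k_v = 1e-14 m^2.
Ra = rayleigh_number(k_v=1e-14, dT=150.0, H=5000.0)
print(f"Ra = {Ra:.1f}  (convection onsets when Ra exceeds the critical value;"
      " 4*pi^2 ~ 39.5 for the isotropic benchmark case)")
```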
Abstract:
Extension of overthickened continental crust is commonly characterized by an early core complex stage of extension followed by a later stage of crustal-scale rigid block faulting. These two stages are clearly recognized during the extensional destruction of the Alpine orogen in northeast Corsica, where rigid block faulting overprinting core complex formation eventually led to crustal separation and the formation of a new oceanic backarc basin (the Ligurian Sea). Here we investigate the geodynamic evolution of continental extension by using a novel, fully coupled thermomechanical numerical model of the continental crust. We consider that the dynamic evolution is governed by fault weakening, which is generated by the evolution of the natural-state variables (i.e., pressure, deviatoric stress, temperature, and strain rate) and their associated energy fluxes. Our results show the appearance of a detachment layer that controls the initial separation of the brittle crust on characteristic listric faults, and core complex formation that exhumes strongly deformed rocks of the detachment zone and relatively undeformed crustal cores. This process is followed by a transitional period, characterized by apparent tectonic quiescence, in which deformation is not localized and energy stored in the upper crust is transferred downward, causing self-organized mobilization of the lower crust. Eventually, the entire crust ruptures on major crosscutting faults, shifting the tectonic regime from core complex formation to wholesale rigid block faulting.
Abstract:
Introduction: The aim of this study was to compare the influence of preflaring on the accuracy of 4 electronic apex locators (EALs): Root ZX, Elements Diagnostic Unit and Apex Locator, Mini Apex Locator, and Apex DSP. Methods: Forty extracted teeth were preflared by using S1 and SX ProTaper instruments. The working length was established by reducing 1 mm from the total length (TL). The ability of the EALs to detect precise (-1 mm from TL) and acceptable (-1 +/- 0.5 mm from TL) measurements in unflared and preflared canals was determined. Results: The precise and acceptable (P/A) readings in unflared canals for the Root ZX, Elements Diagnostic Unit and Apex Locator, Mini Apex Locator, and Apex DSP were 50%/97.5%, 47.5%/95%, 50%/97.5%, and 45%/67.5%, respectively. For preflared canals, the readings were 75%/97.5%, 55%/95%, 75%/97.5%, and 60%/87.5%, respectively. For the precise criterion, preflaring increased the percentage of accurate electronic readings for the Root ZX and the Mini Apex Locator (P < .05). For the acceptable criterion, no differences were found among the Root ZX, Elements Diagnostic Unit and Apex Locator, and Mini Apex Locator (P > .05). The Fisher test indicated lower accuracy for the Apex DSP (P < .05). Conclusions: The Root ZX and Mini Apex Locator devices significantly increased the precision of real working length determination after the preflaring procedure. All the EALs showed acceptable determination of the working length within the +/- 0.5 mm range except for the Apex DSP device, which had the lowest accuracy. (J Endod 2009;35:1300-1302)
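A minimal sketch of the kind of Fisher exact comparison reported above, with 2x2 counts reconstructed from the stated percentages (n = 40 canals per device). This only illustrates the test; it is not the authors' actual analysis.

```python
# Sketch of the Fisher exact comparison reported in the abstract.
# Counts are reconstructed from the percentages (n = 40 canals per device).
from scipy.stats import fisher_exact

# Acceptable (+/- 0.5 mm) readings in unflared canals:
# Root ZX 97.5% (39/40) vs. Apex DSP 67.5% (27/40).
root_zx  = [39, 1]    # [acceptable, not acceptable]
apex_dsp = [27, 13]

odds_ratio, p_value = fisher_exact([root_zx, apex_dsp])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")  # p < .05 -> lower Apex DSP accuracy
```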
Abstract:
This study assessed the effect of simulated mastication on the retention of two stud attachment systems for two-implant overdentures. Sixteen specimens, each simulating an edentulous ridge with implants and an overdenture, were divided into two groups according to the attachment system: Group I (Nobel Biocare ball-socket attachments) and Group II (Locator attachments). Retention forces were measured before and after 400 000 simulated masticatory loads in a customised device. Data were compared by two-way ANOVA followed by the Bonferroni test (alpha = 0.05). Group I presented significantly lower retention forces (Newtons) than Group II at baseline (10.6 +/- 3.6 and 66.4 +/- 16.0, respectively). However, differences were not significant after 400 000 loads (7.9 +/- 4.3 and 21.6 +/- 17.0). The number of cycles did not influence the measurements in Group I, whereas a non-linear descending curve was found for Group II. It was concluded that simulated mastication resulted in minor changes for the ball attachment tested. Nevertheless, it reduced the retention of Locator attachments to 40% of the baseline values, which suggests that mastication is a major factor associated with maintenance needs for this system.
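The comparison described above can be sketched as a two-way ANOVA (attachment system x loading stage). The data below are simulated to mimic the reported means and standard deviations, so every number in the output is hypothetical; the group sizes and random seed are assumptions.

```python
# Sketch of the two-way ANOVA described in the abstract, on simulated
# retention-force data that only mimics the reported values.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
n = 8  # specimens per group (assumption)
data = pd.DataFrame({
    "force":  np.concatenate([rng.normal(10.6, 3.6, n),    # ball, baseline
                              rng.normal(7.9, 4.3, n),     # ball, after loading
                              rng.normal(66.4, 16.0, n),   # Locator, baseline
                              rng.normal(21.6, 17.0, n)]), # Locator, after loading
    "system": ["ball"] * (2 * n) + ["locator"] * (2 * n),
    "stage":  (["baseline"] * n + ["loaded"] * n) * 2,
})

model = ols("force ~ C(system) * C(stage)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))  # system, stage, and interaction effects
```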
Abstract:
The aim of this study was to validate an original portable device to measure attachment retention of implant overdentures both in the lab and in clinical settings. The device was built with a digital force measurement gauge (Imada) secured to a vertical wheel stand associated with a customized support to hold and position the denture at adjustable angulations. Sixteen matrix and patrix cylindrical stud attachments (Locator(R)) were randomly assigned as in vitro test specimens. Attachment abutments were secured in an implant analogue attached to the digital force gauge or to the load cell of a traction machine used as the gold standard (Instron Universal Testing Machine). Matrices were secured in a denture duplicate attached to the customized support, permitting reproducibility of their position on both pulling devices. Attachment retention in the axial direction was evaluated by measuring the maximum dislodging force, or peak load, during five consecutive linear dislodgements of each attachment on both devices. After a wear simulation, retention was measured again at several time periods. The peak load measurements with the customized Imada device were similar to those obtained with the gold-standard Instron machine. These findings suggest that the proposed portable device can provide accurate information on the retentive properties of attachment systems for removable dental prostheses.
Abstract:
Observations of accelerating seismic activity prior to large earthquakes in natural fault systems have raised hopes for intermediate-term earthquake forecasting. If this phenomenon does exist, then what causes it to occur? Recent theoretical work suggests that the accelerating seismic release sequence is a symptom of increasing long-wavelength stress correlation in the fault region. A more traditional explanation, based on Reid's elastic rebound theory, argues that an accelerating sequence of seismic energy release could be a consequence of increasing stress in a fault system whose stress moment release is dominated by large events. Both of these theories are examined using two discrete models of seismicity: a Burridge-Knopoff block-slider model and an elastic continuum based model. Both models display an accelerating release of seismic energy prior to large simulated earthquakes. In both models, the rate of seismic energy release is correlated with both the total root-mean-square stress and the level of long-wavelength stress correlation. Furthermore, both models exhibit a systematic increase in the number of large events at high stress and high long-wavelength stress correlation levels. These results suggest that either explanation is plausible for the accelerating moment release in the models examined. A statistical model based on the Burridge-Knopoff block-slider is constructed which indicates that stress alone is sufficient to produce an accelerating release of seismic energy with time prior to a large earthquake.
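For readers unfamiliar with the Burridge-Knopoff family of models, here is a minimal quasi-static block-slider sketch in the cellular-automaton style (closer to Olami-Feder-Christensen than to the paper's models): blocks are loaded uniformly, fail at a stress threshold, and pass a fraction of their stress drop to their neighbours. All parameters are illustrative assumptions.

```python
# Minimal 1-D Burridge-Knopoff-style block-slider sketch (quasi-static,
# cellular-automaton flavour). Illustrative only; the paper's models are
# more elaborate.
import numpy as np

rng = np.random.default_rng(1)
N, alpha, threshold = 200, 0.2, 1.0   # blocks, neighbour transfer, failure stress
stress = rng.uniform(0, threshold, N)

def drive_and_relax(stress):
    """Load to the next failure, then cascade; return event size (blocks slipped)."""
    stress += threshold - stress.max()          # uniform tectonic loading
    size = 0
    while (unstable := np.flatnonzero(stress >= threshold)).size:
        for i in unstable:
            drop = stress[i]
            stress[i] = 0.0
            for j in (i - 1, i + 1):            # redistribute to neighbours
                if 0 <= j < N:
                    stress[j] += alpha * drop
            size += 1
    return size

sizes = [drive_and_relax(stress) for _ in range(2000)]
print("largest event:", max(sizes), "blocks; mean:", np.mean(sizes))
```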
Abstract:
Fault detection and isolation (FDI) are important steps in the monitoring and supervision of industrial processes. Biological wastewater treatment (WWT) plants are difficult to model, and hence to monitor, because of the complexity of the biological reactions and because plant influent and disturbances are highly variable and/or unmeasured. Multivariate statistical models have been developed for a wide variety of situations over the past few decades, proving successful in many applications. In this paper we develop a new monitoring algorithm based on Principal Components Analysis (PCA). It can be seen equivalently as making Multiscale PCA (MSPCA) adaptive, or as a multiscale decomposition of adaptive PCA. Adaptive Multiscale PCA (AdMSPCA) exploits the changing multivariate relationships between variables at different time-scales. Adaptation of scale PCA models over time permits them to follow the evolution of the process, inputs or disturbances. Performance of AdMSPCA and adaptive PCA on a real WWT data set is compared and contrasted. The most significant difference observed was the ability of AdMSPCA to adapt to a much wider range of changes. This was mainly due to the flexibility afforded by allowing each scale model to adapt whenever it did not signal an abnormal event at that scale. Relative detection speeds were examined only summarily, but seemed to depend on the characteristics of the faults/disturbances. The results of the algorithms were similar for sudden changes, but AdMSPCA appeared more sensitive to slower changes.
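A minimal sketch of the core PCA monitoring step underlying approaches such as AdMSPCA: fit PCA on normal operating data, then flag samples whose squared prediction error (the SPE, or Q statistic) exceeds an empirical control limit. The multiscale wavelet decomposition and adaptive model updating of AdMSPCA are deliberately omitted, and the data, component count, and limit choice are assumptions.

```python
# PCA-based fault detection sketch: flag samples with unusually large
# squared prediction error (SPE/Q) relative to normal operating data.
import numpy as np

def fit_pca(X, n_components):
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return X.mean(axis=0), Vt[:n_components]        # mean and loadings P

def spe(X, mean, P):
    """Squared prediction error: residual after projecting onto the PCA subspace."""
    Xc = X - mean
    residual = Xc - Xc @ P.T @ P
    return np.sum(residual**2, axis=1)

rng = np.random.default_rng(2)
normal = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6))  # correlated "plant" data
mean, P = fit_pca(normal, n_components=2)
limit = np.quantile(spe(normal, mean, P), 0.99)               # empirical 99% control limit

test = normal[:5].copy()
test[2, 0] += 8.0                                             # inject a sensor fault
print(spe(test, mean, P) > limit)                             # the faulty sample is flagged
```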
Abstract:
The hanging wall of the Alpine Fault near Franz Josef Glacier has been exhumed during the past ~2-3 m.y., providing a sample of the ductilely deformed middle crust of a modern obliquely convergent orogen. Presently exposed rocks of the Pacific Plate are inferred to have undergone several phases of ductile deformation as they moved westward above a mid-crustal detachment. Initially they were transpressed across the outboard part of the orogen, resulting in oblate fabrics with a down-dip stretch. Later, they encountered the Alpine Fault, experiencing oblique-slip backshearing on vertical planes. This escalator-like deformation tilted and thinned the incoming crust onto that crustal-scale oblique ramp. This style of hanging wall deformation may affect only the most rapidly uplifting, central part of the Southern Alps because of the low flexural rigidity of the crust in that region and its displacement over a relatively sharp ramp angle at depth. A 3D transpressive flow affected mylonites locally near the fault, but their shear direction remained parallel to plate motion, ruling out ductile 'extrusion' as an important process in this orogen. Outside the mylonite zone, late Cenozoic shortening is inferred to be modest (30-40%), as measured from deformation of younger biotite grains. Oblique collision is dominated by translation on the Alpine Fault, and rocks migrate rapidly through the deforming zone, preventing the accumulation of large finite strains. Transpression may play a minor role in oblique collision.
Abstract:
It has been argued that power-law time-to-failure fits for cumulative Benioff strain and an evolution in size-frequency statistics in the lead-up to large earthquakes are evidence that the crust behaves as a Critical Point (CP) system. If so, intermediate-term earthquake prediction is possible. However, this hypothesis has not been proven. If the crust does behave as a CP system, stress correlation lengths should grow in the lead-up to large events through the action of small to moderate ruptures, and drop sharply once a large event occurs. However, this evolution in stress correlation lengths cannot be observed directly. Here we show, using the lattice solid model to describe discontinuous elasto-dynamic systems subjected to shear and compression, that it is possible for correlation lengths to exhibit CP-type evolution. In the case of a granular system subjected to shear, this evolution occurs in the lead-up to the largest event and is accompanied by an increasing rate of moderate-sized events and power-law acceleration of Benioff strain release. In the case of an intact sample system subjected to compression, the evolution occurs only after a mature fracture system has developed. The results support the existence of a physical mechanism for intermediate-term earthquake forecasting and suggest this mechanism is fault-system dependent. This offers an explanation of why accelerating Benioff strain release is not observed prior to all large earthquakes. The results prove the existence of an underlying evolution in discontinuous elasto-dynamic systems which is capable of providing a basis for forecasting catastrophic failure and earthquakes.
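The power-law time-to-failure fit mentioned above is usually written in the Bufe-Varnes form eps(t) = A + B(t_f - t)^m with B < 0 and m ~ 0.3. Below is a sketch of fitting it with scipy on synthetic data that stands in for a real catalogue; all numbers are assumptions.

```python
# Sketch: fitting the power-law time-to-failure relation for cumulative
# Benioff strain, eps(t) = A + B * (t_f - t)**m, to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def benioff(t, A, B, tf, m):
    # the clamp keeps the optimiser away from t > tf, where the power is undefined
    return A + B * np.maximum(tf - t, 1e-9) ** m

t = np.linspace(0.0, 9.5, 80)                          # years; "mainshock" near t = 10
true = benioff(t, A=12.0, B=-3.0, tf=10.0, m=0.3)      # B < 0 -> accelerating release
rng = np.random.default_rng(3)
obs = true + rng.normal(0.0, 0.05, t.size)

popt, _ = curve_fit(benioff, t, obs, p0=(10.0, -1.0, 10.5, 0.5))
print(f"fitted failure time t_f = {popt[2]:.2f}, exponent m = {popt[3]:.2f}")
```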
Abstract:
We introduce a conceptual model for the in-plane physics of an earthquake fault. The model employs cellular automaton techniques to simulate tectonic loading, earthquake rupture, and strain redistribution. The impact of a hypothetical crustal elastodynamic Green's function is approximated by a long-range strain redistribution law with an r^(-p) dependence. We investigate the influence of the effective elastodynamic interaction range upon the dynamical behaviour of the model by conducting experiments with different values of the exponent p. The results indicate that this model has two distinct, stable modes of behaviour. The first mode produces a characteristic earthquake distribution, with moderate to large events preceded by an interval of time in which the rate of energy release accelerates. A correlation function analysis reveals that accelerating sequences are associated with a systematic, global evolution of strain energy correlations within the system. The second stable mode produces Gutenberg-Richter statistics, with near-linear energy release and no significant global correlation evolution. A model with effectively short-range interactions preferentially displays Gutenberg-Richter behaviour. However, models with long-range interactions appear to switch between the characteristic and GR modes. As the range of elastodynamic interactions is increased, characteristic behaviour begins to dominate GR behaviour. These models demonstrate that evolution of strain energy correlations may occur within systems with a fixed elastodynamic interaction range. Supposing that similar mode-switching dynamical behaviour occurs within earthquake faults, then intermediate-term forecasting of large earthquakes may be feasible for some earthquakes but not for others, in alignment with certain empirical seismological observations. Further numerical investigation of dynamical models of this type may lead to advances in earthquake forecasting research and theoretical seismology.
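A toy version of the long-range redistribution rule described above: when a cell fails, its stress drop is shared among the other cells with weight proportional to r^(-p). The loading scheme, dissipation factor, and all parameters are illustrative assumptions, not the paper's automaton.

```python
# Toy long-range strain redistribution with an r^(-p) kernel, on a 1-D grid.
import numpy as np

rng = np.random.default_rng(4)
N, p, threshold = 128, 2.0, 1.0
stress = rng.uniform(0.0, threshold, N)
idx = np.arange(N)

def redistribute(stress, i):
    """Fail cell i and spread its stress with an r^-p kernel."""
    r = np.abs(idx - i).astype(float)
    w = np.where(r > 0, r ** -p, 0.0)
    w *= 0.8 / w.sum()                 # 20% of the drop is dissipated (assumption)
    stress += w * stress[i]
    stress[i] = 0.0

# One loading/rupture cycle: load to the next failure, then cascade.
stress += threshold - stress.max()
event = 0
while (failing := np.flatnonzero(stress >= threshold)).size:
    for i in failing:
        redistribute(stress, i)
        event += 1
print("rupture size:", event, "cells")
```

Increasing p concentrates the kernel on near neighbours, which is the short-range limit the abstract associates with Gutenberg-Richter behaviour; small p approaches mean-field loading.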
Abstract:
The particle-based Lattice Solid Model (LSM) was developed to provide a basis to study the physics of rocks and the nonlinear dynamics of earthquakes (MORA and PLACE, 1994; PLACE and MORA, 1999). A new modular and flexible LSM approach has been developed that allows different microphysics to be easily included in or removed from the model. The approach provides a virtual laboratory where numerical experiments can easily be set up and all measurable quantities visualised. The proposed approach provides a means to simulate complex phenomena such as fracturing or localisation processes, and enables the effect of different micro-physics on macroscopic behaviour to be studied. The initial 2-D model is extended to allow three-dimensional simulations to be performed and particles of different sizes to be specified. Numerical bi-axial compression experiments under different confining pressures are used to calibrate the model. By tuning the different microscopic parameters (such as coefficient of friction, microscopic strength, and distribution of grain sizes), the macroscopic strength of the material can be adjusted to be in agreement with laboratory experiments, and the orientation of fractures is consistent with the theoretical value predicted from the Mohr-Coulomb diagram. Simulations indicate that 3-D numerical models have different macroscopic properties than 2-D models and, hence, the model must be recalibrated for 3-D simulations. These numerical experiments illustrate that the new approach is capable of simulating typical rock fracture behaviour. The new model provides a basis to investigate nucleation, rupture and slip pulse propagation in complex fault zones without the previous model's limitations of a regular low-level fault surface geometry and a restriction to two dimensions.
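The Mohr-Coulomb consistency check mentioned above reduces to a one-line formula: for a friction angle phi = arctan(mu), shear fractures are predicted at theta = 45 deg - phi/2 from the maximum compressive stress. A worked example, where the value of mu is an arbitrary assumption:

```python
# Mohr-Coulomb fracture orientation: theta = 45 deg - phi/2 from sigma_1.
import math

mu = 0.6                                  # microscopic friction coefficient (assumed)
phi = math.degrees(math.atan(mu))         # friction angle, ~31 deg
theta = 45.0 - phi / 2.0                  # predicted fracture orientation
print(f"phi = {phi:.1f} deg -> fracture at {theta:.1f} deg from sigma_1")
```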
Abstract:
The Commonwealth Government's Principles-Based Review of the Law of Negligence recently recommended reforms aimed at limiting liability and damages arising from personal injury and death, in response to the growing perception that the current system of compensating personal injury had become financially unsustainable. Recent increases in medical liability and damages have eroded the confidence of doctors and their professional bodies, with fears of unprecedented desertion from and reduced recruitment into high-risk areas, and one of the primary foci of the review concerned medical negligence. The article analyses proposals to redefine the principles necessary for the finding of negligence, against the terms of reference of the review. The article assumes that for the foreseeable future, Australia will persist with tort-based compensation for personal injury rather than developing a no-fault scheme. If the suggested changes to the fundamental principles of negligence are unlikely to reduce medical liability, greater attention might be given to the processes which come into play after the finding of negligence, where reform is more likely to benefit both plaintiffs and defendants.
Abstract:
Graphical user interfaces (GUIs) make software easy to use by providing the user with visual controls. Therefore, the correctness of GUI code is essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their more important aspects. This paper presents a generic model for language-independent reverse engineering of graphical user interface based applications, and explores the integration of model-based testing techniques into the approach, thus allowing fault detection to be performed. A prototype tool has been constructed, which is already capable of deriving and testing a user interface behavioral model of applications written in Java/Swing.
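A language-agnostic sketch of the model-based testing idea described above (in Python rather than Java/Swing, with a trivial stub standing in for a real GUI): the reverse-engineered behavioural model is a state machine, and tests walk its transitions, checking that the application under test agrees. Every name here is hypothetical and not from the paper's tool.

```python
# Model-based testing sketch: verify an application against a behavioural model.

MODEL = {  # (state, event) -> expected next state
    ("login", "ok_with_valid_password"): "main",
    ("login", "ok_with_invalid_password"): "login",
    ("main", "close"): "exit",
}

class LoginAppStub:
    """Stand-in for the application under test (a real tool would drive the GUI)."""
    def __init__(self):
        self.state = "login"
    def fire(self, event):
        if event == "ok_with_valid_password" and self.state == "login":
            self.state = "main"
        elif event == "close" and self.state == "main":
            self.state = "exit"
        # any other event leaves the state unchanged

def run_model_based_tests():
    for (state, event), expected in MODEL.items():
        app = LoginAppStub()
        app.state = state              # drive the app into the model's source state
        app.fire(event)
        assert app.state == expected, f"fault: {state} --{event}--> {app.state}"
    print("all modelled transitions verified")

run_model_based_tests()
```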