878 results for Fault-proneness


Relevance: 10.00%

Publisher:

Abstract:

In this study, a non-linear excitation controller using inverse filtering is proposed to damp inter-area oscillations. The proposed controller is based on determining the generator flux value for the next sampling time, obtained by maximising the reduction rate of the system's kinetic energy after the fault. The desired flux for the next time interval is obtained using wide-area measurements, and the equivalent area rotor angles and velocities are predicted using a non-linear Kalman filter. A supplementary control input for the excitation system is implemented using the inverse filtering approach to track the desired flux. The inverse filtering approach ensures that the non-linearity introduced by saturation is well compensated. The efficacy of the proposed controller, with and without communication time delay, is evaluated on different IEEE benchmark systems including Kundur's two-area, the Western System Coordinating Council three-area, and the 16-machine, 68-bus test systems.
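The abstract does not spell the energy criterion out; one hedged reading, with an assumed two-area equivalent of inertias M_1, M_2 and speed deviations predicted by the Kalman filter, is that the flux reference for the next sample is chosen to maximise the decay rate of the post-fault kinetic energy:

\[
W_{KE}(k) = \tfrac{1}{2} M_1\,\Delta\omega_1(k)^2 + \tfrac{1}{2} M_2\,\Delta\omega_2(k)^2,
\qquad
\psi^{*}(k+1) = \arg\max_{\psi}\; \frac{W_{KE}(k) - \hat{W}_{KE}(k+1 \mid \psi)}{T_s},
\]

where \(\hat{W}_{KE}(k+1 \mid \psi)\) is the kinetic energy predicted one sample ahead for a candidate flux \(\psi\) and \(T_s\) is the sampling period; the symbols are illustrative, not notation taken from the paper.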

Relevance: 10.00%

Publisher:

Abstract:

Lake Purrumbete maar is located in the intraplate, monogenetic Newer Volcanics Province in southeastern Australia. The extremely large crater, 3000 m in diameter, formed on an intersection of two fault lines and comprises at least three coalesced vents. The evolution of these vents is controlled by the interaction of the tectonic setting and the properties of both hard and soft rock aquifers. Lithics in the maar deposits originate from country rock formations less than 300 m deep, indicating that the large size of the crater cannot be solely the result of the downwards migration of the explosion foci in a single vent. Vertical crater walls and primary inward-dipping beds indicate that the original size of the crater has been largely preserved. Detailed mapping of the facies distributions, the direction of transport of base surges and pyroclastic flows, and the distribution of ballistic block fields forms the basis for the reconstruction of the complex eruption history, which is characterised by alternations of the eruption style between relatively dry and wet phreatomagmatic conditions, and migration of the vent location along tectonic structures. Three temporally separated eruption phases are recognised, each starting at the same crater located directly at the intersection of two local fault lines. Activity then moved quickly to different locations. A significant volcanic hiatus between two of the three phases shows that the magmatic system was reactivated. The enlargement of the main crater in particular, by both lateral and vertical growth, led to the intersection of the individual craters and the formation of the large circular crater. Lake Purrumbete maar is an excellent example of how complicated the evolution of large, seemingly simple, circular maar volcanoes can be, and raises the question of whether these systems are actually monogenetic.

Relevance: 10.00%

Publisher:

Abstract:

The encroaching built environment, together with increased fault current levels, demands a robust design approach and prolonged, improved performance of the earth grid. With this in mind, the aim of the project was to perform a sensitivity analysis of the earth grid and an earthing performance evaluation with graphene-coated conductors. Subsequently, a conceptual design to continuously monitor the performance of the earth grid was developed. In this study, earth grid design standards were compared to evaluate their appropriate use in determining the safety condition. A process to grow a thin film of graphene on the surface of cylindrical copper rods was developed to evaluate earthing performance in terms of conductivity and corrosion susceptibility.

Relevance: 10.00%

Publisher:

Abstract:

Any kind of imbalance in the operation of a wind turbine has an adverse effect on the downstream torsional components as well as the tower structure. It is crucial to detect imbalance at its very inception. Identification of the type of imbalance is also required so that appropriate measures of fault accommodation can be performed in the control system. In particular, it is important to distinguish between mass and aerodynamic imbalance. While the former is gradually caused by a structural anomaly (e.g. ice deposition or moisture accumulation inside a blade), the latter is generally associated with a fault in the pitch control system. This paper proposes a technique for the detection and identification of imbalance faults in large-scale wind turbines. Unlike most existing methods, it requires only the rotor speed signal, which is readily available in existing turbines. Signature frequencies are proposed to identify the imbalance type based on its physical phenomenology. The performance of the technique has been evaluated by simulations using an existing benchmark model, and the simulation results confirm the effectiveness of the proposed method.
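The abstract names only the rotor speed signal and signature frequencies; the sketch below illustrates the kind of spectral check this implies, under the common assumption that a rotor imbalance appears as a once-per-revolution (1P) component in the measured rotor speed. Function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def imbalance_indicator(rotor_speed, fs, rated_speed_hz):
    """Estimate the frequency and amplitude of the once-per-revolution (1P)
    component in the rotor speed signal, a common signature of rotor imbalance.
    rotor_speed    : array of rotor speed samples [rad/s]
    fs             : sampling frequency [Hz]
    rated_speed_hz : nominal rotational frequency [Hz] (1P)"""
    x = rotor_speed - np.mean(rotor_speed)        # remove the DC (mean speed) component
    window = np.hanning(len(x))                   # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(x * window))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    # look for a peak in a narrow band around the 1P frequency
    band = (freqs > 0.8 * rated_speed_hz) & (freqs < 1.2 * rated_speed_hz)
    i = np.argmax(spectrum[band])
    return freqs[band][i], spectrum[band][i]

# Example: 10 minutes of a 0.2 Hz rotor with a small 1P speed ripple
fs = 50.0
t = np.arange(0, 600, 1.0 / fs)
speed = 0.2 * 2 * np.pi + 0.01 * np.sin(2 * np.pi * 0.2 * t)
print(imbalance_indicator(speed, fs, rated_speed_hz=0.2))
```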

Relevance: 10.00%

Publisher:

Abstract:

Young drivers represent approximately 20% of the Omani population, yet account for over one third of crash injuries and fatalities on Oman's roads. Internationally, research has demonstrated that social influences play an important role in young driver safety; however, there is little research examining this within Arab Gulf countries. This study sought to explore young driver behaviour using Akers' social learning theory. A self-report survey was completed by 1319 young drivers (72.9% male, 27.1% female) aged 17-25 years. A hierarchical regression model was used to investigate the contribution of social learning variables (norms and behaviour of significant others, personal attitudes towards risky behaviour, imitation of significant others, and beliefs about the rewards and punishments offered by risky behaviour), socio-demographic characteristics (age and gender), driving experience (initial training, time driving and previous driving without supervision), and sensitivity to rewards and punishments to the self-reported risky driving behaviours of young drivers. It was found that 39.6% of the young drivers reported having been involved in at least one crash since the issuance of their driving licence, and they were considered ‘at fault’ in 60.7% of these crashes. The hierarchical multiple regression models revealed that socio-demographic characteristics and driving experience alone explained 14.2% of the variance in risky driving behaviour. Introducing the social learning factors into the model explained a further 37.0% of the variance. Finally, 7.9% of the variance in risky behaviour could be explained by including individual sensitivity to rewards and punishments. These findings and their implications are discussed.
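As an illustration of the hierarchical (blockwise) regression strategy described above, the sketch below fits nested OLS models and reports the incremental variance explained by each block; the column names and block composition are hypothetical stand-ins for the survey variables.

```python
import statsmodels.api as sm

# Hypothetical column names; the actual survey items are not given in the abstract.
blocks = [
    ["age", "gender", "months_licensed", "prior_unsupervised_driving"],  # demographics + experience
    ["peer_norms", "attitudes", "imitation", "anticipated_rewards"],     # social learning variables
    ["reward_sensitivity", "punishment_sensitivity"],                    # individual sensitivities
]

def hierarchical_r2(df, outcome, blocks):
    """Fit OLS models block by block and report (block, R^2, incremental R^2),
    mirroring the hierarchical regression strategy described in the abstract."""
    predictors, results = [], []
    prev_r2 = 0.0
    for block in blocks:
        predictors += block
        X = sm.add_constant(df[predictors])
        fit = sm.OLS(df[outcome], X).fit()
        results.append((block, fit.rsquared, fit.rsquared - prev_r2))
        prev_r2 = fit.rsquared
    return results

# usage: hierarchical_r2(survey_df, "risky_driving_score", blocks)
```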

Relevance: 10.00%

Publisher:

Abstract:

Objective: To examine the association between glaucoma and motor vehicle collision (MVC) involvement among older drivers, including the role of visual field impairment that may underlie any association found. Design: A retrospective population-based study. Participants: A sample of 2,000 licensed drivers aged 70 years and older residing in north central Alabama. Methods: At-fault MVC involvement for the five years prior to enrollment was obtained from state records. Three aspects of visual function were measured: habitual binocular distance visual acuity, binocular contrast sensitivity, and the binocular driving visual field constructed by combining the monocular visual fields of each eye. Poisson regression was used to calculate crude and adjusted rate ratios (RR) and 95% confidence intervals (CI). Main Outcome Measures: At-fault MVC involvement for the five years prior to enrollment. Results: Drivers with glaucoma (n = 206) had a 1.65 (95% CI 1.20-2.28, p = 0.002) times higher MVC rate compared to those without glaucoma after adjusting for age, gender and mental status. Among those with glaucoma, drivers with severe visual field loss had higher MVC rates (RR = 2.11, 95% CI 1.09-4.09, p = 0.027), whereas no significant association was found for impaired visual acuity or contrast sensitivity. When the visual field was subdivided into six regions (upper, lower, left, and right visual fields; horizontal and vertical meridians), impairment in the left, upper or lower visual field was associated with higher MVC rates, and an impaired left visual field showed the highest RR (RR = 3.16, p = 0.001) compared to other regions. No significant association was found for deficits in the right visual field or along the horizontal or vertical meridian. Conclusions: This population-based study suggests that older drivers with glaucoma are more likely to have a history of at-fault MVC involvement than those without glaucoma. Impairment in the driving visual field in drivers with glaucoma appears to have an independent association with at-fault MVC involvement, whereas visual acuity and contrast sensitivity impairments do not.
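A hedged sketch of the kind of Poisson regression used to obtain adjusted rate ratios, with simulated data standing in for the study sample; the variable names, the 5-year exposure offset and the built-in effect size are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated stand-in data (the real study data are not available here):
# one row per driver, at-fault crash count over a 5-year lookback, glaucoma indicator.
rng = np.random.default_rng(0)
glaucoma = rng.binomial(1, 0.1, 2000)
df = pd.DataFrame({
    "glaucoma": glaucoma,
    "age":      rng.uniform(70, 90, 2000),
    "male":     rng.binomial(1, 0.5, 2000),
    "years":    np.full(2000, 5.0),
})
df["crashes"] = rng.poisson(0.15 * np.exp(0.5 * glaucoma))  # built-in RR of exp(0.5)

# Poisson regression with log(exposure) as the offset; exp(coef) is the rate ratio.
fit = smf.glm("crashes ~ glaucoma + age + male", data=df,
              family=sm.families.Poisson(), offset=np.log(df["years"])).fit()
print("adjusted RR for glaucoma:", np.exp(fit.params["glaucoma"]))
print("95% CI:", np.exp(fit.conf_int().loc["glaucoma"]).tolist())
```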

Relevance: 10.00%

Publisher:

Abstract:

Evidence increasingly suggests that our behaviour on the road mirrors our behaviour across other aspects of our life. The idea that we drive as we live, described by Tillman and Hobbs (1949) more than 65 years ago when examining the off-road behaviours of taxi drivers, is the focus of the current paper. As part of a larger study examining the impact of penalty changes on a large cohort of Queensland speeding offenders, criminal history (lifetime) and crash history (10-year period) data for a sub-sample of 1000 offenders were obtained. Based on the ‘drive as we live’ maxim, it was hypothesised that crash-involved speeding offenders would be more likely to have a criminal history than non-crash-involved offenders. Overall, only 30% of speeding offenders had a criminal history. However, crash-involved offenders were significantly more likely to have a criminal history (49.4%) than non-crash-involved offenders (28.6%), supporting the hypothesis. Furthermore, those deemed ‘most at fault’ in a crash were the group most likely to have at least one criminal offence (52.2%). When compared to the non-crash-involved offenders, those deemed ‘not most at fault’ in a crash were also more likely to have had at least one criminal offence (46.5%). Therefore, compared to non-crash-involved speeding offenders, offenders involved in a crash were more likely to have been convicted of at least one criminal offence, irrespective of whether they were deemed ‘most at fault’ in that crash. Implications for traffic offender management and policing are discussed.

Relevance: 10.00%

Publisher:

Abstract:

This paper presents the modeling and analysis of a voltage source converter (VSC) based back-to-back (BTB) HVDC link. The case study considers the response to changes in the active and reactive power and to a disturbance caused by a single line-to-ground (SLG) fault. The controllers at each terminal are designed to inject a balanced set of sinusoidal voltages of variable magnitude and phase angle to regulate or control the active and reactive power. It is also possible to regulate the converter bus (AC) voltage by controlling the injected reactive power. The analysis is carried out using both a d-q model (neglecting the harmonics in the output voltages of the VSC) and a three-phase detailed model of the VSC. While the eigenvalue analysis and controller design are based on the d-q model, the transient simulation considers both models.
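For orientation, a textbook d-q model of a VSC connected to its AC bus through a series R-L interface is shown below; this is the generic form such studies start from, not necessarily the exact model used in the paper.

\[
L\frac{di_d}{dt} = -R\,i_d + \omega L\,i_q + v_{sd} - v_{cd}, \qquad
L\frac{di_q}{dt} = -R\,i_q - \omega L\,i_d + v_{sq} - v_{cq},
\]
\[
P = \tfrac{3}{2}\,(v_{sd}\,i_d + v_{sq}\,i_q), \qquad
Q = \tfrac{3}{2}\,(v_{sq}\,i_d - v_{sd}\,i_q),
\]

where \(v_{sd}, v_{sq}\) are the converter-bus voltage components, \(v_{cd}, v_{cq}\) the voltages injected by the VSC, and harmonics are neglected as in the d-q model mentioned above.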

Relevance: 10.00%

Publisher:

Abstract:

Simultaneous consideration of both performance and reliability issues is important in the choice of computer architectures for real-time aerospace applications. One of the requirements for such a fault-tolerant computer system is the characteristic of graceful degradation. A shared and replicated resources computing system represents such an architecture. In this paper, a combinatorial model is used for the evaluation of the instruction execution rate of a degradable, replicated-resources computing system such as a modular multiprocessor system. Next, a method is presented to evaluate the computation reliability of such a system utilizing a reliability graph model and the instruction execution rate. Finally, this computation reliability measure, which simultaneously describes both performance and reliability, is applied as a constraint in an architecture optimization model for such computing systems.
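The abstract does not reproduce the combinatorial model itself; the sketch below is a simplified illustration of the general idea, combining a binomial survival model over replicated processors with an assumed per-processor execution rate, and treating computation reliability as the probability that the surviving configuration still meets a throughput demand. All parameters, names and the linear rate model are illustrative, not taken from the paper.

```python
from math import comb, exp

def execution_rate(k, rate_per_processor, memory_interference=0.0):
    """Instruction execution rate with k working processors: an assumed
    linear model with an optional degradation term for shared-memory
    interference."""
    return k * rate_per_processor * (1.0 - memory_interference) ** max(k - 1, 0)

def computation_reliability(n, failure_rate, t, demand_rate, rate_per_processor):
    """Probability, at time t, that the surviving processors of an n-processor
    system still deliver at least the demanded instruction execution rate.
    Processor lifetimes are assumed independent and exponentially distributed."""
    p = exp(-failure_rate * t)                      # single-processor survival probability
    total = 0.0
    for k in range(n + 1):
        prob_k = comb(n, k) * p**k * (1 - p)**(n - k)
        if execution_rate(k, rate_per_processor) >= demand_rate:
            total += prob_k
    return total

# e.g. 4 processors of 1 MIPS each, a 2 MIPS demand, lambda = 1e-4 /h, t = 1000 h
print(computation_reliability(4, 1e-4, 1000.0, 2.0, 1.0))
```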

Relevance: 10.00%

Publisher:

Abstract:

This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, viewed as a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model helpful in clarifying this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software “bugs”, the failure history of the software system in the various phases of its lifecycle, the reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process are all considered to varying degrees of detail. We also discuss the notion of software fault-tolerance, methods of achieving it, and the status of other measures of software dependability such as maintainability, availability and safety.
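Among the reliability growth models a survey like this typically covers, the Goel-Okumoto NHPP model gives a convenient concrete example of estimating the number of errors remaining; the sketch below fits its mean value function to illustrative cumulative failure counts. The data are invented for demonstration, and this is only one of the model families the paper discusses.

```python
import numpy as np
from scipy.optimize import curve_fit

# Cumulative failures observed at the end of each week of testing (illustrative data).
t = np.arange(1, 11, dtype=float)
failures = np.array([12, 21, 28, 33, 37, 40, 42, 44, 45, 46], dtype=float)

def goel_okumoto(t, a, b):
    """Mean value function of the Goel-Okumoto NHPP model:
    a = expected total number of faults, b = per-fault detection rate."""
    return a * (1.0 - np.exp(-b * t))

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, failures, p0=(50.0, 0.1))
remaining = a_hat - failures[-1]     # estimated faults still latent in the code
print(f"estimated total faults = {a_hat:.1f}, remaining = {remaining:.1f}")
```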

Relevance: 10.00%

Publisher:

Abstract:

This paper reviews the notion of Byzantine-resilient distributed computing systems, the relevant protocols and their possible applications as reported in the literature. The three agreement problems, namely the consensus problem, the interactive consistency problem, and the generals problem, are discussed. Various agreement protocols for the Byzantine generals problem are summarized in terms of their performance and level of fault-tolerance. The three classes of Byzantine agreement protocols discussed are the deterministic, randomized, and approximate agreement protocols. Finally, the application of Byzantine agreement protocols to clock synchronization is highlighted.
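As a concrete reference point for the deterministic class, the sketch below simulates the classical oral-messages algorithm OM(m) for the Byzantine generals problem. The adversary model is deliberately simplified (traitors merely invert whatever they relay), so it illustrates the recursion rather than the full protocol analysis.

```python
from collections import Counter

def om(commander, lieutenants, value, m, traitors):
    """Recursive oral-messages algorithm OM(m) of Lamport, Shostak and Pease.
    Returns {lieutenant: decided value}. Traitorous senders invert the value
    they relay (one simple adversary model; real traitors can behave worse)."""
    def sent(sender, v):
        return (1 - v) if sender in traitors else v

    if m == 0:
        return {lt: sent(commander, value) for lt in lieutenants}

    decisions = {}
    for lt in lieutenants:
        received = [sent(commander, value)]            # value obtained directly
        for relay in (x for x in lieutenants if x != lt):
            # 'relay' re-broadcasts what it got from the commander via OM(m-1)
            sub = om(relay, [x for x in lieutenants if x != relay],
                     sent(commander, value), m - 1, traitors)
            received.append(sub[lt])
        decisions[lt] = Counter(received).most_common(1)[0][0]
    return decisions

# 4 generals (1 commander + 3 lieutenants), 1 traitor: OM(1) suffices (n >= 3m + 1).
print(om(commander=0, lieutenants=[1, 2, 3], value=1, m=1, traitors={2}))
```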

Relevance: 10.00%

Publisher:

Abstract:

This paper develops a seven-level inverter structure for open-end winding induction motor drives. The inverter supply is realized by cascading four two-level and two three-level neutral-point-clamped inverters. The inverter control is designed in such a way that the common-mode voltage (CMV) is eliminated. DC-link capacitor voltage balancing is also achieved by using only the switching-state redundancies. The proposed power circuit structure is modular and therefore suitable for fault-tolerant applications. By appropriately isolating some of the inverters, the drive can be operated during fault conditions in a five-level or a three-level inverter mode, with preserved CMV elimination and DC-link capacitor voltage balancing, within a reduced modulation range.
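The abstract does not define the CMV expression; in dual-inverter open-end winding drives it is commonly taken as the difference between the averages of the two inverter groups' pole voltages, so elimination amounts to restricting the modulator to switching states for which the two averages are equal (a standard formulation assumed here, not quoted from the paper):

\[
v_{CM} \;=\; \frac{v_{A1O1} + v_{B1O1} + v_{C1O1}}{3} \;-\; \frac{v_{A2O2} + v_{B2O2} + v_{C2O2}}{3} \;=\; 0,
\]

where the subscripts 1 and 2 denote the inverter groups feeding the two ends of the open winding and O1, O2 are their respective DC-bus midpoints.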

Relevance: 10.00%

Publisher:

Abstract:

In this paper, the validity of the single fault assumption in deriving diagnostic test sets is examined with respect to crosspoint faults in programmable logic arrays (PLAs). The control input procedure developed here can be used to convert PLAs having undetectable crosspoint faults into crosspoint-irredundant PLAs for testing purposes. All crosspoints are testable in crosspoint-irredundant PLAs. The control inputs are used as extra variables during testing and are maintained at logic 1 during normal operation. A useful heuristic for obtaining a near-minimal number of control inputs is suggested. Expressions for calculating bounds on the number of control inputs have also been obtained.

Relevance: 10.00%

Publisher:

Abstract:

An on-line algorithm is developed for the location of single crosspoint faults in a PLA (FPLA). The main feature of the algorithm is the determination of a fault set corresponding to the response obtained for a failed test. For the apparently small number of faults in this set, all other tests are generated and a fault table is formed. Subsequently, an adaptive procedure is used to diagnose the fault. A functional equivalence test is carried out to determine the actual fault class if the adaptive testing results in a set of faults with identical tests. The large amount of computation time and storage required to determine, a priori, all the fault equivalence classes, or to construct a fault dictionary, is not needed here. A brief study of functional equivalence among the crosspoint faults is also made.
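A minimal sketch of the adaptive narrowing step described above, once the small candidate fault set and its fault table are in hand; the data structures, names and example responses are hypothetical, and test generation and the functional-equivalence check are not modelled.

```python
def diagnose(fault_table, apply_test):
    """Adaptive diagnosis over a small candidate fault set.
    fault_table : {fault: {test: expected_response}}, complete over all tests
    apply_test  : callable(test) -> response observed on the failed PLA
    Tests are applied one at a time; candidates whose expected response
    disagrees with the observation are discarded. What remains is either a
    single fault or a set of faults with identical tests, which the abstract
    resolves with a separate functional-equivalence check."""
    candidates = set(fault_table)
    tests = sorted({t for responses in fault_table.values() for t in responses})
    for test in tests:
        if len(candidates) <= 1:
            break
        expected = {fault_table[f][test] for f in candidates}
        if len(expected) <= 1:              # this test cannot split the candidates
            continue
        observed = apply_test(test)
        candidates = {f for f in candidates if fault_table[f][test] == observed}
    return candidates

# e.g. diagnose({"g1_growth": {"t1": 0, "t2": 1}, "d2_shrink": {"t1": 0, "t2": 0}},
#               apply_test=lambda t: 1 if t == "t2" else 0)   # -> {"g1_growth"}
```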

Relevance: 10.00%

Publisher:

Abstract:

The opportunities and challenges faced by litigants who strategically plead intentional torts are borne out by two recent medical cases, both of which arose out of dental treatment. Dean v Phung established some key principles which were clarified in White v Johnston. Before considering those two cases, it is worth examining the environment in which such intentional tort claims now exist. Following the Ipp Review of the Law of Negligence, non-uniform legislative changes to the law of negligence were introduced across Australia, imposing limitations on liability and the quantum of damages in cases where a person has been injured through the fault of another. While it seems that, given the limitation of the scope of the review and its recommendations to negligently caused damage, the Ipp Review reforms were meant to be confined to injury resulting from negligent acts rather than intentional torts, the extent to which the civil liability legislation applies to intentional torts differs across Australia.