35 results for Hazard, Interactive Failure, Repairable Systems, Asset Maintenance
Abstract:
The advent and evolution of geohazard warning systems make for an interesting study. The two broad fields that are immediately visible are geohazard evaluation and subsequent warning dissemination. Evidently, the latter field lacks any systematic study or standards. Arbitrarily organized and vague data and information on warning techniques create confusion and indecision. The purpose of this review is to try to systematize the available bulk of information on warning systems so that meaningful insights can be derived through decidable flowcharts, and a developmental process can be undertaken. Hence, the methods and technologies for numerous geohazard warning systems have been assessed by putting them into suitable categories for a better understanding of possible ways to analyze their efficacy as well as their shortcomings. By establishing a classification scheme based on extent, control, time period, and advancements in technology, the geohazard warning systems reported in the literature could be comprehensively analyzed and evaluated. Although major advancements have taken place in geohazard warning systems in recent times, they have been lacking a complete purpose. Some systems just assess the hazard and wait for other means to communicate it, and some are designed only for communication and wait for the hazard information to be provided, which usually happens after the mishap. Primarily, systems are left at the mercy of administrators and service providers and do not operate in real time. An integrated hazard evaluation and warning dissemination system could solve this problem. Warning systems have also suffered from inherent complexity, the requirement of expert-level monitoring, extensive and dedicated infrastructural setups, and so on. The user community, which would greatly appreciate having a convenient, fast, and generalized warning methodology, is surveyed in this review.
The review concludes with the future scope of research in the field of hazard warning systems and some suggestions for developing an efficient mechanism toward the development of an automated integrated geohazard warning system. DOI: 10.1061/(ASCE)NH.1527-6996.0000078. (C) 2012 American Society of Civil Engineers.
Abstract:
In recent years, the time-dependent maintenance of expensive high-voltage power equipment has been getting replaced by condition-based maintenance, so as to detect an impending failure of the equipment a priori. For condition-based maintenance, most monitoring systems concentrate on electrical quantities, such as the measurement and evaluation of partial discharges, tan δ, the tip-up test, dielectric strength, insulation resistance, and polarization and depolarization currents. However, for equipment being developed with novel nanodielectric insulating materials, the variation in these parameters before an impending failure is not available. Hence, in this work, accelerated electrothermal aging studies have been conducted on unfilled epoxy as well as epoxy nanocomposite samples with 5 wt% filler loading, and the tan δ values were continuously monitored to assess the condition of the samples under study. It was observed that those samples whose tan δ increased at a rapid rate failed first.
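The diagnostic idea described above — flagging the samples whose tan δ rises fastest during aging — can be sketched as a simple trend analysis. This is a minimal illustration, not the paper's method: the data values, sample names, and ranking criterion are invented for demonstration; the abstract reports only that rapid tan δ growth preceded failure.

```python
# Illustrative sketch: rank aging samples by how fast their tan-delta rises.
# All data and names below are hypothetical examples, not values from the study.

def tan_delta_slope(readings):
    """Least-squares slope of tan-delta vs. measurement index for one sample."""
    n = len(readings)
    times = range(n)
    mean_t = sum(times) / n
    mean_y = sum(readings) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, readings))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

def rank_by_degradation(samples):
    """Return sample names ordered from fastest- to slowest-rising tan delta."""
    return sorted(samples, key=lambda name: tan_delta_slope(samples[name]),
                  reverse=True)

samples = {
    "unfilled_epoxy":  [0.010, 0.012, 0.016, 0.024, 0.040],  # rapid rise
    "nano_5wt_epoxy":  [0.011, 0.011, 0.012, 0.013, 0.014],  # gradual rise
}
print(rank_by_degradation(samples))  # fastest-degrading sample listed first
```

Under this toy criterion, a steeply rising tan δ trace ranks a sample as closest to failure, mirroring the observation that rapid tan δ growth was the failure precursor.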
Abstract:
Nucleoside diphosphate kinase (NDK), a substrate-nonspecific enzyme involved in the maintenance of nucleotide pools, is also implicated in pivotal roles in many other cellular processes. NDK is conserved from bacteria to humans and forms a homotetramer or hexamer to exhibit its biological activity. However, the functional oligomeric form of the enzyme differs among organisms. The functional form of NDKs from many bacterial systems, including that of the human pathogen Mycobacterium tuberculosis (MtuNDK), is a hexamer, although some bacterial NDKs are tetrameric in nature. The present study addresses the oligomeric properties of MsmNDK and how a dimer, the basic subunit of the functional hexamer, is stabilized by hydrogen bonds and hydrophobic interactions. A homology model was generated using the three-dimensional structure of MtuNDK as a template, and the residues interacting at the monomer-monomer interface of MsmNDK were mapped. Using recombinant wild-type, catalytically inactive, and monomer-monomer interface mutants of MsmNDK, the stability of the dimer was verified under heat, SDS, low pH, and methanol. The predicted residues (Gln17, Ser24, and Glu27) were engaged in dimer formation; however, the mutated proteins retained ATPase and GTPase activity even after the introduction of single (MsmNDK-Q17A, MsmNDK-E27A, and MsmNDK-E27Q) and double (MsmNDK-E27A/Q17A) mutations. However, the monomer-monomer interaction could be abolished using methanol, indicating stabilization of the monomer-monomer interaction by hydrophobic interactions.
Abstract:
As the volume of data relating to proteins increases, researchers rely more and more on the analysis of published data, thus increasing the importance of good access to these data, which range from the supplemental material of individual articles all the way to major reference databases with professional staff and long-term funding. Specialist protein resources fill an important middle ground, providing interactive web interfaces to their databases for a focused topic or family of proteins, using specialized approaches that are not feasible in the major reference databases. Many are labors of love, run by a single lab with little or no dedicated funding, and there are many challenges to building and maintaining them. This perspective arose from a meeting of several specialist protein resources and major reference databases held at the Wellcome Trust Genome Campus (Cambridge, UK) on August 11 and 12, 2014. During this meeting, some common key challenges involved in creating and maintaining such resources were discussed, along with various approaches to address them. In laying out these challenges, we aim to inform users about how these issues impact our resources and illustrate ways in which working together could enhance their accuracy, currency, and overall value. Proteins 2015; 83:1005-1013. (c) 2015 The Authors. Proteins: Structure, Function, and Bioinformatics Published by Wiley Periodicals, Inc.
Abstract:
Exascale systems of the future are predicted to have a mean time between failures (MTBF) of less than one hour. At such low MTBFs, employing periodic checkpointing alone will result in low efficiency, because the high number of application failures leads to large amounts of lost work due to rollbacks. In such scenarios, it is highly necessary to have proactive fault-tolerance mechanisms that can help avoid a significant number of failures. In this work, we have developed a mechanism for proactive fault tolerance using partial replication of a set of application processes. Our fault-tolerance framework adaptively changes the set of replicated processes periodically, based on failure predictions, to avoid failures. We have developed an MPI prototype implementation, PAREP-MPI, that allows changing the replica set. We have shown that our strategy involving adaptive process replication significantly outperforms existing mechanisms, providing up to 20 percent improvement in application efficiency even for exascale systems.
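The core idea above — periodically re-selecting which processes to replicate based on failure predictions — can be sketched in a few lines. This is a hypothetical illustration only: the function names, the probability values, and the fixed replication budget are assumptions for demonstration, and PAREP-MPI's actual interface and prediction model are not described in the abstract.

```python
# Illustrative sketch of adaptive partial replication: replicate only the
# `budget` ranks predicted most likely to fail, re-evaluating each period.
# Names, probabilities, and budget are hypothetical, not from PAREP-MPI.

def choose_replica_set(failure_prob, budget):
    """Pick the `budget` MPI ranks with the highest predicted failure probability."""
    ranked = sorted(failure_prob, key=failure_prob.get, reverse=True)
    return set(ranked[:budget])

# Two adaptation periods: as predictions change, the replica set changes too.
period1 = {0: 0.01, 1: 0.30, 2: 0.05, 3: 0.22}  # rank -> predicted P(failure)
period2 = {0: 0.40, 1: 0.02, 2: 0.05, 3: 0.22}

print(choose_replica_set(period1, budget=2))  # {1, 3}
print(choose_replica_set(period2, budget=2))  # {0, 3}
```

With a partial-replication budget of two out of four ranks, the framework shields the currently most failure-prone processes each period instead of paying the cost of replicating everything.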