949 results for mean time between failures


Relevance: 100.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for the degree of Master in Mechanical Engineering

Relevance: 100.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for the degree of Master in Mechanical Engineering

Relevance: 100.00%

Abstract:

Dissertation for the degree of Master in Industrial Engineering and Management

Relevance: 100.00%

Abstract:

This thesis presents a methodology for linking Total Productive Maintenance (TPM) and Quality Function Deployment (QFD). The synergic power of TPM and QFD led to the formation of a new maintenance model named Maintenance Quality Function Deployment (MQFD). This model proved powerful enough to overcome the drawbacks of TPM by taking the voice of the customer into account. Those customer voices are used to develop the house of quality. The outputs of the house of quality, expressed as technical languages, are submitted to top management for strategic decision-making. The technical languages concerned with enhancing maintenance quality are strategically directed by top management towards the adoption of the eight TPM pillars. The TPM characteristics developed through these eight pillars are fed into the production system, where their implementation is focused on increasing the values of the maintenance quality parameters, namely overall equipment efficiency (OEE), mean time between failures (MTBF), mean time to repair (MTTR), performance quality, availability, and mean downtime (MDT). The outputs from the production system are expected to be reflected in business values, namely improved maintenance quality, increased profit, upgraded core competence, and enhanced goodwill. A unique feature of the MQFD model is that it does not require changing or dismantling the existing processes of developing the house of quality and TPM projects that may already be practised in the company concerned. Thus, the MQFD model enables a tactical marriage between QFD and TPM. First, the literature was reviewed. The results of this review indicated that no work had so far been reported on integrating QFD in TPM, or vice versa. During the second phase, a survey was conducted in six companies in which TPM had been implemented. The objective of this survey was to locate any traces of QFD implementation in the TPM programmes of these companies. The survey results indicated that no effort to integrate QFD in TPM had been made in these companies. After completing these two phases, the MQFD model was designed; its details are presented in this research work. Following this, exploratory studies on implementing the MQFD model in real-time environments were conducted. In addition, an empirical study was carried out to examine the receptivity of the MQFD model among practitioners and across varied organisational cultures. Finally, a sensitivity analysis was conducted to find the hierarchy of the various factors influencing MQFD in a company. Throughout the research, the theory and practice of MQFD were juxtaposed by presenting and publishing papers among scholarly communities and by conducting case studies in real-world scenarios.

Relevance: 100.00%

Abstract:

Software systems are progressively being deployed in many facets of human life, and the failure of such systems has an assorted impact on their customers. The fundamental aspect that supports a software system is a focus on quality. Reliability describes the ability of a system to function in a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability. Most earlier works focused on software reliability with no consideration of hardware parts, or vice versa. However, a complete estimation of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves identifying failure data for hardware and software components and building a model on that basis to predict reliability. To develop such a model, focus is given to systems based on Open Source Software, since there is an increasing trend towards its use and only a few studies have been reported on modelling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and an attempt to present an integrated model for predicting the reliability of a computational system. The developed model has been compared with existing models and its usefulness is discussed.
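One common way to combine hardware and software reliability, in the spirit of the combined approach this abstract calls for, is a series model with exponential lifetimes: the system survives only if both parts do, so reliabilities multiply. The sketch below is a generic illustration with invented failure rates, not the model developed in the work:

```python
import math

def reliability(t, lam):
    """Exponential survival: R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

def system_reliability(t, lam_hw, lam_sw):
    # Series system: both hardware and software must survive to time t,
    # so the reliabilities multiply (equivalently, the rates add).
    return reliability(t, lam_hw) * reliability(t, lam_sw)

# Invented rates: hardware 1e-4 failures/h, software 5e-4 failures/h.
print(round(system_reliability(100.0, 1e-4, 5e-4), 4))  # 0.9418
```

The product of the two exponentials is itself exponential with rate 6e-4, so at 100 h the system reliability is exp(-0.06) ≈ 0.9418.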

Relevance: 100.00%

Abstract:

In high speed manufacturing systems, continuous operation is desirable, with minimal disruption for repairs and service. An intelligent diagnostic monitoring system, designed to detect developing faults before catastrophic failure, or prior to undesirable reduction in output quality, is a good means of achieving this. Artificial neural networks have already been found to be of value in fault diagnosis of machinery. The aim here is to provide a system capable of detecting a number of faults, in order that maintenance can be scheduled in advance of sudden failure, and to reduce the necessity to replace parts at intervals based on mean time between failures. Instead, parts will need to be replaced only when necessary. Analysis of control information in the form of position error data from two servomotors is described.
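Condition-based replacement of the kind described above can be illustrated with a toy detector on position-error samples. The windowing scheme, names and thresholds below are all invented for illustration; they are not the neural-network system of the study:

```python
# Toy condition monitor: flag a developing fault when the RMS of recent
# position-error samples grows well beyond the baseline RMS, instead of
# replacing parts on a fixed MTBF schedule.

def rms(values):
    """Root-mean-square of a sequence of samples."""
    return (sum(v * v for v in values) / len(values)) ** 0.5

def fault_developing(errors, window=5, ratio=2.0):
    """True when the RMS of the last `window` samples exceeds
    `ratio` times the RMS of the earlier baseline samples."""
    baseline, recent = errors[:-window], errors[-window:]
    return rms(recent) > ratio * rms(baseline)

healthy = [0.1, -0.1, 0.12, -0.08, 0.1, 0.09, -0.11, 0.1, -0.1, 0.1]
faulty = healthy[:5] + [0.4, -0.5, 0.45, -0.42, 0.5]
print(fault_developing(healthy), fault_developing(faulty))  # False True
```

A real system would, as the abstract notes, learn fault signatures from servomotor data rather than rely on a fixed threshold, but the maintenance logic is the same: act on measured condition, not elapsed time.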

Relevance: 100.00%

Abstract:

BACKGROUND: Patient-controlled epidural analgesia with low concentrations of anesthetics is effective in reducing labor pain. The aim of this study was to assess and compare two ultra-low dose regimens of ropivacaine and sufentanil (0.1% ropivacaine plus 0.5 μg.ml⁻¹ sufentanil vs. 0.06% ropivacaine plus 0.5 μg.ml⁻¹ sufentanil) on the intervals between boluses and the duration of labor. MATERIAL AND METHODS: In this non-randomized prospective study, conducted between January and July 2010, two groups of parturients received patient-controlled epidural analgesia: Group I (n = 58; 1 mg.ml⁻¹ ropivacaine + 0.5 μg.ml⁻¹ sufentanil) and Group II (n = 57; 0.6 mg.ml⁻¹ ropivacaine + 0.5 μg.ml⁻¹ sufentanil). Rescue doses of ropivacaine at the concentration of the assigned group without sufentanil were administered as necessary. Pain, local anesthetic requirements, neuraxial blockade characteristics, labor and neonatal outcomes, and maternal satisfaction were recorded. RESULTS: The ropivacaine dose was greater in Group I (9.5 [7.7-12.7] mg.h⁻¹ vs. 6.1 [5.1-9.8] mg.h⁻¹, p < 0.001). A time increase between each bolus was observed in Group I (beta = 32.61 min, 95% CI [25.39; 39.82], p < 0.001), whereas a time decrease was observed in Group II (beta = -1.40 min, 95% CI [-2.44; -0.36], p = 0.009). The duration of the second stage of labor in Group I was significantly longer than that in Group II (78 min vs. 65 min, p < 0.001). CONCLUSIONS: Parturients receiving 0.06% ropivacaine exhibited less evidence of cumulative effects and faster second-stage progression than those who received 0.1% ropivacaine.

Relevance: 100.00%

Abstract:

The net mechanical efficiency of positive work (η_pos) has been shown to increase if it is immediately preceded by negative work. This phenomenon is explained by the storage of elastic energy during the negative phase and its release during the subsequent positive phase. If a transition time (T) takes place, the elastic energy is dissipated into heat. The aim of the present study was to investigate the relationship between η_pos and T, and to determine the minimal T required for η_pos to reach its minimal value. Seven healthy male subjects were tested during four series of lowering-raising of the body mass. In the first series (S0), the negative and positive phases were executed without any transition time. In the three other series, T was varied by a timer (0.12, 0.24 and 0.56 s for series S1, S2 and S3, respectively). These exercises were performed on a force platform sensitive to vertical forces to measure the mechanical work, and a gas analyser was used to determine the energy expenditure. The results indicated that η_pos was highest (31.1%) for the series without any transition time (S0). The efficiencies observed with transition times (S1, S2 and S3) were 27.7, 26.0 and 23.8%, respectively, demonstrating that T plays an important role in mechanical efficiency. The investigation of the relationship between η_pos and T revealed that the minimal T required for η_pos to reach its minimal value is 0.59 s.

Relevance: 100.00%

Abstract:

The ability to undertake repeat measurements of flow-mediated dilatation (FMD) within a short time of a previous measurement would be useful to improve accuracy or to repeat a failed initial procedure. Although standard methods report that a minimum of 10 min is required between measurements, there is no published data to support this. Thirty healthy volunteers had five FMD measurements performed within a 2-h period, separated by various time intervals (5, 15 and 30 min). In 19 volunteers, FMD was also performed as soon as the vessel had returned to its baseline diameter. There was no significant difference between any of the FMD measurements or parameters across the visits indicating that repeat measurements may be taken after a minimum of 5 min or as soon as the vessel has returned to its baseline diameter, which in some subjects may be less than 5 min.

Relevance: 100.00%

Abstract:

We considered prediction techniques based on accelerated failure time models with random effects for correlated survival data. Besides the Bayesian approach through the empirical Bayes estimator, we also discussed the use of a classical predictor, the Empirical Best Linear Unbiased Predictor (EBLUP). To illustrate the use of these predictors, we considered an application to a real data set from the oil industry. More specifically, the data set involves the mean time between failures of petroleum-well equipment in the Bacia Potiguar. The goal of this study is to predict the risk/probability of failure in order to support a preventive maintenance program. The results show that both methods are suitable for predicting future failures, supporting good decisions regarding the employment and economy of resources for preventive maintenance.
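Under a much simpler assumption than the study's random-effects AFT models, namely a constant failure rate, the failure probability over a planning horizon follows directly from the MTBF. A hypothetical sketch, with the function name and all figures invented:

```python
import math

# With an exponential lifetime (constant failure rate 1/MTBF), the
# probability of failing within horizon t is:
#   P(T <= t) = 1 - exp(-t / MTBF)

def failure_probability(t, mtbf):
    """Probability of at least one failure within time t, given the MTBF."""
    return 1.0 - math.exp(-t / mtbf)

# e.g. equipment with an MTBF of 500 days over a 90-day maintenance horizon:
print(round(failure_probability(90.0, 500.0), 4))  # 0.1647
```

A roughly 16% failure risk over the horizon could then feed a preventive-maintenance decision rule; the AFT models of the study refine this by letting the failure rate vary with covariates and well-level random effects.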

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

On the Limits of Greenwich Mean Time, or The Failure of a Modernist Revolution. From the introduction of World Standard Time in 1884 to Einstein's theory of relativity, the nature and regulation of time was a highly contested issue in modernism, with profound political, social and epistemological consequences. Modernist aesthetic sensibilities widely revolted against the increasingly strict rule of the clock, which, as Georg Simmel observed in "The Metropolis and Mental Life," was established as the necessary basis of a capitalist, urban life. This paper will focus on the contending conceptions of time arising in key modernist texts by authors like Joyce, Woolf and Conrad. I will argue that the uniformity and regularity of time necessary to a rising capitalist society came under attack in a similar way by both modernist literary aesthetics and new scientific discoveries. However, while Einstein's theory of relativity may have led to a subsequent change of paradigm in scientific thought, it has failed to significantly alter social and popular conceptions of time. Although alternative ways of thinking and living with time are proposed by modernist authors, they remain isolated aesthetic experiments, ineffectual against the regulatory pressure of economic and social structures. In this struggle over the nature of time, so I suggest, science and literature join forces against a society that is increasingly governed by economic reason. The fact that they lost this struggle can serve as a striking illustration of an increasing shift of social influence from science and art towards the economy.

Relevance: 100.00%

Abstract:

Calcareous nannoplankton assemblages and benthic δ18O isotopes of Pliocene deep-sea sediments of ODP Site 1172 (east of Tasmania) have been studied to improve our knowledge of Southern Ocean paleoceanography. Our study site is located just north of the Subtropical Front (STF), an ideal setting to monitor migrations of the STF during our study period, between 3.45 and 2.45 Ma. The assemblage identified at ODP Site 1172 has been interpreted as characteristic of the transitional-zone water mass, located south of the STF, based on: (i) the low abundances (< 1%) of subtropical taxa, (ii) relatively high percentages of Coccolithus pelagicus, a subpolar-type species, (iii) abundances of 2-10% of Calcidiscus leptoporus, a species that frequently inhabits the zone south of the STF, and (iv) the high abundances of small Noelaerhabdaceae, which at present dominate the zone south of the STF. Across our interval, the calcareous nannoplankton manifests glacial-interglacial variability. We have identified cold events, characterized by high abundances of C. pelagicus, which coincide with glacial periods, except during G7. After 3.1 Ma, cold events are more frequent, in concordance with global cooling trends. Around 2.75 Ma, the interglacial stage G7 is characterized by anomalously low temperatures, which most likely are linked to the definitive closure of the Central American Seaway (CAS), an event that is believed to have had global consequences. A gradual increase of very small Reticulofenestra across our section marks a significant trend in the small Noelaerhabdaceae species group and has been linked to a generally enhanced mixing of the water column, in agreement with previous studies. It is suggested that a rapid decline of small Gephyrocapsa after isotopic stage G7 might be related to the cooling observed at our study site after the closure of the CAS.