969 results for Reliability, Failure Distribution Function, Hazard Rate, Exponential Distribution
Abstract:
So far, in the bivariate setup, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is more important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. Design of high-reliability systems generally requires that the individual system components have extremely high reliability even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
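The vector hazard rate referred to here has a simple numerical form, h_i(t1, t2) = -∂/∂t_i log S(t1, t2). A minimal sketch, using Gumbel's bivariate exponential as an illustrative survival model (the model choice and parameter values are assumptions, not taken from the abstract):

```python
import math

def survival(t1, t2, theta=0.5):
    # Gumbel's bivariate exponential survival function -- an illustrative
    # model choice, not one taken from the abstract.
    return math.exp(-t1 - t2 - theta * t1 * t2)

def vector_hazard(t1, t2, theta=0.5, eps=1e-6):
    # Johnson-Kotz vector hazard rate: h_i(t1, t2) = -d/dt_i log S(t1, t2),
    # approximated here by central finite differences.
    def log_s(a, b):
        return math.log(survival(a, b, theta))
    h1 = -(log_s(t1 + eps, t2) - log_s(t1 - eps, t2)) / (2 * eps)
    h2 = -(log_s(t1, t2 + eps) - log_s(t1, t2 - eps)) / (2 * eps)
    return h1, h2

# closed form for this model: h1 = 1 + theta*t2, h2 = 1 + theta*t1
print(vector_hazard(1.0, 2.0))  # close to (2.0, 1.5)
```

For this model log S is linear in each argument, so the finite-difference values match the closed-form components essentially exactly.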
Abstract:
The present work is intended to discuss various properties and reliability aspects of higher order equilibrium distributions in continuous, discrete and multivariate cases, which contribute to the study of equilibrium distributions. At first, we study and consolidate the existing literature on equilibrium distributions. For this we need some basic concepts in reliability, which are discussed in Chapter 2. In Chapter 3, some identities connecting the failure rate functions and moments of residual life of the univariate, non-negative continuous equilibrium distributions of higher order and those of the baseline distribution are derived. These identities are then used to characterize the generalized Pareto model, mixtures of exponentials and the gamma distribution. An approach using characteristic functions is also discussed, with illustrations. Moreover, characterizations of ageing classes using stochastic orders are discussed. Part of the results of this chapter has been reported in Nair and Preeth (2009). Various properties of equilibrium distributions of non-negative discrete univariate random variables are discussed in Chapter 4. Then some characterizations of the geometric, Waring and negative hypergeometric distributions are presented. Moreover, the ageing properties of the original distribution and the nth order equilibrium distributions are compared. Part of the results of this chapter has been reported in Nair, Sankaran and Preeth (2012). Chapter 5 is a continuation of Chapter 4. Here, several conditions, in terms of stochastic orders connecting the baseline and its equilibrium distributions, are derived. These conditions can be used to redefine certain ageing notions. Then equilibrium distributions of two random variables are compared in terms of various stochastic orders that have implications in reliability applications. In Chapter 6, we make two approaches to define multivariate equilibrium distributions of order n.
Then various properties, including characterizations, of higher order equilibrium distributions are presented. Part of the results of this chapter has been reported in Nair and Preeth (2008). The thesis is concluded in Chapter 7. A discussion on further studies on equilibrium distributions is also made in this chapter.
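For orientation, the recursive construction that these identities build on can be stated compactly (a standard formulation; the notation here is illustrative, not necessarily the thesis's own):

```latex
% n-th order equilibrium density of a non-negative X with survival
% function \bar F and finite mean \mu:
f_1(x) = \frac{\bar F(x)}{\mu}, \qquad
f_n(x) = \frac{\bar F_{n-1}(x)}{\mu_{n-1}}, \quad n \ge 2,
```

where $\bar F_{n-1}$ and $\mu_{n-1}$ are the survival function and mean of the $(n-1)$th equilibrium distribution. In particular, the failure rate of the first-order equilibrium distribution equals the reciprocal of the baseline mean residual life, $h_1(x) = 1/m(x)$, which is the kind of identity linking failure rates and residual-life moments that the chapters exploit.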
Abstract:
This paper presents a new methodology to evaluate, in a predictive way, the reliability of distribution systems, considering the impact of automatic recloser switches. The developed algorithm is based on state enumeration techniques with Markovian models and on minimal cut set theory. Some computational aspects related to the implementation of the proposed algorithm in typical distribution networks are also discussed. The description of the proposed approach is carried out using a sample test system. The results obtained with a typical configuration of a Brazilian system (EDP Bandeirante Energia S.A.) are presented and discussed.
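The minimal-cut-set side of such an evaluation admits a compact first-order sketch (illustrative component data and function names; the paper's algorithm additionally enumerates Markovian states and models recloser actions, which this does not reproduce):

```python
import math

def system_unavailability(min_cuts, q):
    # First-order (rare-event) approximation from minimal cut set theory:
    # the system fails when every component in some minimal cut set is
    # down, so U is approximated by the sum over cuts of the product of
    # the component unavailabilities.
    return sum(math.prod(q[c] for c in cut) for cut in min_cuts)

# toy feeder: component 'a' alone forms a cut; 'b' and 'c' only jointly
q = {'a': 0.01, 'b': 0.02, 'c': 0.03}
cuts = [['a'], ['b', 'c']]
print(system_unavailability(cuts, q))  # 0.01 + 0.02*0.03
```

The approximation neglects the (much smaller) probability of two cuts failing simultaneously, which is acceptable when component unavailabilities are small.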
Abstract:
System reliability depends on the reliability of the individual components. Therefore, a methodology capable of inferring the state of functionality of these components is necessary to establish reliable quality indices. Allocation models for maintenance and protective devices, among others, have been used to improve the quality and availability of service on electric power distribution systems. This paper proposes a methodology for assessing the reliability of distribution system components in an integrated way, using probabilistic models and fuzzy inference systems to infer the operation probability of each component. © 2012 IEEE.
Abstract:
It is of interest in some applications to determine whether there is a relationship between a hazard rate function (or a cumulative incidence function) and a mark variable which is only observed at uncensored failure times. We develop nonparametric tests for this problem when the mark variable is continuous. Tests are developed for the null hypothesis that the mark-specific hazard rate is independent of the mark versus ordered and two-sided alternatives expressed in terms of mark-specific hazard functions and mark-specific cumulative incidence functions. The test statistics are based on functionals of a bivariate test process equal to a weighted average of differences between a Nelson--Aalen-type estimator of the mark-specific cumulative hazard function and a nonparametric estimator of this function under the null hypothesis. The weight function in the test process can be chosen so that the test statistics are asymptotically distribution-free. Asymptotically correct critical values are obtained through a simple simulation procedure. The testing procedures are shown to perform well in numerical studies, and are illustrated with an AIDS clinical trial example. Specifically, the tests are used to assess if the instantaneous or absolute risk of treatment failure depends on the amount of accumulation of drug resistance mutations in a subject's HIV virus. This assessment helps guide development of anti-HIV therapies that surmount the problem of drug resistance.
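The Nelson--Aalen building block mentioned above, stripped of the mark variable, can be sketched in a few lines (illustrative code, not the paper's implementation):

```python
def nelson_aalen(times, events):
    # Nelson-Aalen estimator of the cumulative hazard H(t).
    # times  : observed times (failure or censoring)
    # events : 1 if the failure was observed, 0 if right-censored
    # Returns (t, H(t)) pairs at the distinct observed failure times.
    pairs = sorted(zip(times, events))
    failure_times = sorted({t for t, e in pairs if e == 1})
    H, path = 0.0, []
    for t in failure_times:
        d = sum(1 for tt, e in pairs if tt == t and e == 1)  # failures at t
        r = sum(1 for tt, _ in pairs if tt >= t)             # at risk just before t
        H += d / r
        path.append((t, H))
    return path

print(nelson_aalen([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
```

The mark-specific version partitions the increments d/r by the observed mark values, which is what the paper's bivariate test process averages over.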
Abstract:
The study of the reliability of components and systems is of great importance in several fields of engineering, and particularly in computer science. When analysing the lifetimes of the elements in a sample, one must take into account the elements that do not fail during the experiment, as well as those that fail from causes other than the one under study. New types of sampling have therefore arisen to cover these cases. The most general of them, censored sampling, is the one considered in our work. In this sampling scheme, both the time until the component fails and the censoring time are random variables. Under the hypothesis that both times are exponentially distributed, Professor Hurt studied the asymptotic behaviour of the maximum likelihood estimator of the reliability function. In principle it seems attractive to use Bayesian methods in reliability studies because they incorporate into the analysis the prior information that is usually available in real problems. We have therefore considered two Bayesian estimators of the reliability of an exponential distribution: the mean and the mode of the posterior distribution. We have calculated the asymptotic expansion of the mean, variance and mean squared error of both estimators when the censoring distribution is exponential. We have also obtained the asymptotic distribution of the estimators for the more general case in which the censoring distribution is Weibull. Two types of large-sample confidence intervals have been proposed for each estimator. The results have been compared with those of the maximum likelihood estimator and with those of two nonparametric estimators, product-limit and Bayesian, with one of our estimators showing superior behaviour.
Finally, we have verified by simulation that our estimators are robust against the assumed censoring distribution, and that one of the proposed confidence intervals is valid with small samples. This study has also served to confirm the better behaviour of one of our estimators. SETTING OUT AND SUMMARY OF THE THESIS: When we study the lifetime of components it is necessary to take into account the elements that do not fail during the experiment, or those that fail for reasons it is desirable to exclude from consideration. The random censorship model is very useful for analysing these data. In this model the time to failure and the censoring time are random variables. We obtain two Bayes estimators of the reliability function of an exponential distribution based on randomly censored data. We have calculated the asymptotic expansion of the mean, variance and mean squared error of both estimators when the censoring distribution is exponential. We have also obtained the asymptotic distribution of the estimators for the more general case of a Weibull censoring distribution. Two large-sample confidence bands have been proposed for each estimator. The results have been compared with those of the maximum likelihood estimator, and with those of two nonparametric estimators: product-limit and Bayesian. One of our estimators has the best behaviour. Finally, we have shown by simulation that our estimators are robust against the assumed censoring distribution, and that one of our intervals does well in small-sample situations.
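For the exponential model, a posterior-mean reliability estimator has a simple closed form under a conjugate prior. The sketch below assumes a Gamma(a, b) prior on the failure rate; the hyperparameters and the function name are illustrative, and this is not necessarily the thesis's exact estimator:

```python
def bayes_reliability(times, events, t, a=1.0, b=1.0):
    # Posterior-mean estimator of R(t) = exp(-lam * t) for exponential
    # lifetimes under random right censoring, with an (assumed) Gamma(a, b)
    # prior on the failure rate lam.  With d observed failures and total
    # time on test T, lam | data ~ Gamma(a + d, b + T), and the posterior
    # mean of exp(-lam * t) is ((b + T) / (b + T + t)) ** (a + d).
    d = sum(events)
    T = sum(times)
    return ((b + T) / (b + T + t)) ** (a + d)

# two failures (at 2 and 3) and one censored observation (at 5):
print(bayes_reliability([2, 3, 5], [1, 1, 0], t=1.0))
```

The censoring times enter only through the total time on test, which is why the exponential/Gamma pairing stays tractable under random censorship.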
Abstract:
This paper addresses the time-variant reliability analysis of structures with random resistance or random system parameters. It deals with the problem of a random load process crossing a random barrier level. The implications of approximating the arrival rate of the first overload by an ensemble-crossing rate are studied. The error involved in this so-called "ensemble-crossing rate" approximation is described in terms of load process and barrier distribution parameters, and in terms of the number of load cycles. Existing results are reviewed, and significant improvements involving load process bandwidth, mean-crossing frequency and time are presented. The paper shows that the ensemble-crossing rate approximation can be accurate enough for problems where load process variance is large in comparison to barrier variance, but especially when the number of load cycles is small. This includes important practical applications like random vibration due to impact loadings and earthquake loading. Two application examples are presented, one involving earthquake loading and one involving a frame structure subject to wind and snow loadings. (C) 2007 Elsevier Ltd. All rights reserved.
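The distinction at issue can be written down directly (a schematic Poisson-crossing formulation; the symbols here are illustrative, with $\nu(r)$ denoting the mean up-crossing rate of barrier level $r$):

```latex
P_f(t) = 1 - \mathbb{E}_R\!\left[e^{-\nu(R)\,t}\right]
\quad \text{(conditioning on the random barrier } R\text{)},
\qquad
P_f(t) \approx 1 - e^{-\mathbb{E}_R[\nu(R)]\,t}
\quad \text{(ensemble-crossing rate)}.
```

By Jensen's inequality, averaging the rate before exponentiating overestimates $P_f(t)$, and the gap grows with $t$ (i.e., with the number of load cycles), consistent with the paper's finding that the approximation works best for few cycles and for barrier variance small relative to load variance.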
Abstract:
Despite extensive efforts to confirm a direct association between Chlamydia pneumoniae and atherosclerosis, different laboratories continue to report a large variability in detection rates. In this study, we analyzed multiple sections from atherosclerotic carotid arteries from 10 endarterectomy patients to determine the location of C. pneumoniae DNA and the number of sections of the plaque required for analysis to obtain a 95% confidence of detecting the bacterium. A sensitive nested PCR assay detected C. pneumoniae DNA in all patients at one or more locations within the plaque. On average, 42% (ranging from 5 to 91%) of the sections from any single patient had C. pneumoniae DNA present. A patchy distribution of C. pneumoniae in the atherosclerotic lesions was observed, with no area of the carotid having significantly more C. pneumoniae DNA present. If a single random 30-μm-thick section was tested, there was only a 35.6 to 41.6% (95% confidence interval) chance of detecting C. pneumoniae DNA in a patient with carotid artery disease. A minimum of 15 sections would therefore be required to obtain a 95% chance of detecting all true positives. The low concentration and patchy distribution of C. pneumoniae DNA in atherosclerotic plaque appear to be among the reasons for inconsistency between laboratories in the results reported.
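Under the simplifying assumption that sections are independent with a common detection probability p, the required number of sections follows from 1 − (1 − p)^n ≥ target. Note this naive calculation does not reproduce the paper's figure of 15 sections, which reflects its own data and the patchy, clustered distribution of the bacterium within plaque:

```python
import math

def sections_needed(p_single, target=0.95):
    # Smallest n with 1 - (1 - p_single)**n >= target, assuming each
    # section independently detects the bacterium with probability
    # p_single (an assumption the paper's patchy-distribution data
    # do not satisfy).
    return math.ceil(math.log(1 - target) / math.log(1 - p_single))

print(sections_needed(0.5))  # 5
```

With clustered positives the effective per-section probability is lower than the marginal one, pushing the required count upward, which is the direction of the paper's result.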
Abstract:
Background: Chronic obstructive pulmonary disease (COPD) has been associated with increased risk for heart failure (HF). The impact of subclinical abnormal spirometric findings on HF risk among older adults without history of COPD is not well elucidated. Methods: We evaluated 2125 participants (age 73.6±2.9 years; 50.5% men; 62.3% white; 45.6/9.4% past/current smokers; body mass index [BMI] 27.2±4.6 kg/m2) without prevalent COPD or HF who underwent baseline spirometry in the Health ABC Study. Abnormal lung function was defined either as forced vital capacity (FVC) below the lower limit of normal (LLN) or forced expiratory volume in 1st second (FEV1) to FVC ratio below the LLN. Results: On follow-up (median, 9.4 years), 68 of 350 (19.4%) participants with abnormal lung function developed HF, as compared to 172 of 1775 (9.7%) participants with normal lung function (hazard ratio [HR], 2.31; 95% confidence interval [CI], 1.74-3.06; P<.001). This increased risk persisted after adjusting for all other independent predictors of HF in the Health ABC Study, BMI, incident coronary events, and several inflammatory markers (HR, 1.82; 95% CI, 1.30-2.54; P<.001), and remained constant over time. Baseline FVC and FEV1 had a linear association with HF risk (Figure). In adjusted models, HF risk increased by 21% (95% CI, 10-36%) per 10% decrease in FVC and 18% (95% CI, 10-28%) per 10% decrease in FEV1 (both P<.001); this association persisted among participants with normal lung function at baseline. Findings were consistent across sex, race, and smoking status. Conclusions: Subclinical abnormal spirometric findings are prevalent among older adults and are independently associated with risk for incident HF.
Abstract:
OBJECTIVES: The goal of this study was to determine whether subclinical thyroid dysfunction was associated with incident heart failure (HF) and echocardiogram abnormalities. BACKGROUND: Subclinical hypothyroidism and hyperthyroidism have been associated with cardiac dysfunction. However, long-term data on the risk of HF are limited. METHODS: We studied 3,044 adults ≥65 years of age who initially were free of HF in the Cardiovascular Health Study. We compared adjudicated HF events over a mean 12-year follow-up and changes in cardiac function over the course of 5 years among euthyroid participants, those with subclinical hypothyroidism (subdivided by thyroid-stimulating hormone [TSH] levels: 4.5 to 9.9, ≥10.0 mU/l), and those with subclinical hyperthyroidism. RESULTS: Over the course of 12 years, 736 participants developed HF events. Participants with TSH ≥10.0 mU/l had a greater incidence of HF compared with euthyroid participants (41.7 vs. 22.9 per 1,000 person years, p=0.01; adjusted hazard ratio: 1.88; 95% confidence interval: 1.05 to 3.34). Baseline peak E velocity, which is an echocardiographic measurement of diastolic function associated with incident HF in the CHS cohort, was greater in those patients with TSH ≥10.0 mU/l compared with euthyroid participants (0.80 m/s vs. 0.72 m/s, p=0.002). Over the course of 5 years, left ventricular mass increased among those with TSH ≥10.0 mU/l, but other echocardiographic measurements were unchanged. Those patients with TSH 4.5 to 9.9 mU/l or with subclinical hyperthyroidism had no increase in risk of HF. CONCLUSIONS: Compared with euthyroid older adults, those adults with TSH ≥10.0 mU/l have a moderately increased risk of HF and alterations in cardiac function, but not older adults with TSH <10.0 mU/l. Clinical trials should assess whether the risk of HF might be ameliorated by thyroxine replacement in individuals with TSH ≥10.0 mU/l.
Abstract:
BACKGROUND: The impact of abnormal spirometric findings on risk for incident heart failure among older adults without clinically apparent lung disease is not well elucidated. METHODS: We evaluated the association of baseline lung function with incident heart failure, defined as first hospitalization for heart failure, in 2125 participants of the community-based Health, Aging, and Body Composition (Health ABC) Study (age, 73.6 ± 2.9 years; 50.5% men; 62.3% white; 37.7% black) without prevalent lung disease or heart failure. Abnormal lung function was defined either as forced vital capacity (FVC) or forced expiratory volume in 1st second (FEV1) to FVC ratio below the lower limit of normal. Percent predicted FVC and FEV1 also were assessed as continuous variables. RESULTS: During follow-up (median, 9.4 years), heart failure developed in 68 of 350 (19.4%) participants with abnormal baseline lung function, as compared with 172 of 1775 (9.7%) participants with normal lung function (hazard ratio [HR] 2.31; 95% confidence interval [CI], 1.74-3.07; P <.001). This increased risk persisted after adjusting for previously identified heart failure risk factors in the Health ABC Study, body mass index, incident coronary heart disease, and inflammatory markers (HR 1.83; 95% CI, 1.33-2.50; P <.001). Percent predicted (%) FVC and FEV1 had a linear association with heart failure risk (HR 1.21; 95% CI, 1.11-1.32 and 1.18; 95% CI, 1.10-1.26, per 10% lower %FVC and %FEV1, respectively; both P <.001 in fully adjusted models). Findings were consistent in sex and race subgroups and for heart failure with preserved or reduced ejection fraction. CONCLUSIONS: Abnormal spirometric findings in older adults without clinical lung disease are associated with increased heart failure risk. (C) 2011 Elsevier Inc. All rights reserved. The American Journal of Medicine (2011) 124, 334-341
Abstract:
SKUTMA, an economic maintenance model for electricity transmission companies, is a reliability-based maintenance model designed for distribution network companies; it prioritizes and schedules the maintenance and investment times of distribution network components. The model uses a dynamic optimization algorithm to find cost minima over the planning period and simulates component deterioration with a deterioration model. In this Master's thesis, a maintenance program has been developed on the basis of the SKUTMA model; it is used to study how the model performs on real feeders and how it can be exploited in the maintenance planning of electricity networks. The thesis also reviews the calculation methodology of the maintenance program and its features. The end result of this work is a clear picture of the model's operation, usability and potential for further development.
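The dynamic-optimization idea can be illustrated with a toy finite-horizon program trading off maintenance cost against expected failure cost (a deliberately simplified deterioration model with assumed names and parameters, not the SKUTMA formulation itself):

```python
def plan_maintenance(horizon, max_age, maint_cost, fail_cost, fail_prob):
    # Finite-horizon dynamic program for one component (toy model).
    # Each period, given component age a, choose:
    #   'maintain' -> pay maint_cost, age resets to 0 for the next period
    #   'run'      -> fail with probability fail_prob(a), paying fail_cost
    #                 and resetting the age; otherwise the age grows by one
    # Returns the expected cost-to-go table V and the decision table act.
    V = [[0.0] * (max_age + 1) for _ in range(horizon + 1)]
    act = [[None] * (max_age + 1) for _ in range(horizon)]
    for t in range(horizon - 1, -1, -1):
        for a in range(max_age + 1):
            p = fail_prob(a)
            run = p * (fail_cost + V[t + 1][0]) \
                + (1 - p) * V[t + 1][min(a + 1, max_age)]
            mnt = maint_cost + V[t + 1][0]
            V[t][a], act[t][a] = min((run, 'run'), (mnt, 'maintain'))
    return V, act

# cheap maintenance vs. a 50% failure risk: maintaining wins
V, act = plan_maintenance(1, 5, maint_cost=2.0, fail_cost=10.0,
                          fail_prob=lambda a: 0.5)
print(V[0][0], act[0][0])
```

A backward sweep like this yields the cost-minimizing action for every (period, condition) state, which is the same structure a feeder-level maintenance scheduler optimizes over, component by component.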
Abstract:
In this paper, we propose a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows the Conway-Maxwell distribution and that the time to the event follows the generalized gamma distribution. This distribution can be used to model survival data when the hazard rate function is increasing, decreasing, bathtub-shaped or unimodal, and it includes some distributions commonly used in lifetime analysis as particular cases. Some appropriate matrices are derived in order to evaluate local influence on the estimates of the parameters by considering different perturbations, and some global influence measurements are also investigated. Finally, a data set from the medical area is analysed.
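The construction can be sketched with the standard promotion-time cure-rate formulation (the symbols below are illustrative, not necessarily the paper's notation). If $N$ is the number of competing causes, with probability generating function $A_N$, and each latent cause has survival function $S(t)$, the population survival is

```latex
S_{\mathrm{pop}}(t) = A_N\big(S(t)\big), \qquad
A_N(s) = \frac{1}{Z(\lambda,\nu)} \sum_{n=0}^{\infty} \frac{(\lambda s)^n}{(n!)^{\nu}},
\qquad
Z(\lambda,\nu) = \sum_{n=0}^{\infty} \frac{\lambda^n}{(n!)^{\nu}},
```

so the cured fraction is $p_0 = S_{\mathrm{pop}}(\infty) = A_N(0) = 1/Z(\lambda,\nu)$, with the Conway-Maxwell count distribution supplying $A_N$ and the generalized gamma supplying $S(t)$.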
Abstract:
Among the traits of economic importance to dairy cattle livestock, those related to sexual precocity and longevity of the herd are essential to the success of the activity, because the stayability of a cow in a herd is determined by its productive and reproductive life. In Brazil, there are few studies on the reproductive efficiency of Swiss-Brown cows, and no study was found using the methodology of survival analysis applied to this breed. Thus, in the first chapter of this study, the age at first calving of Swiss-Brown heifers was analyzed as the time until the event by the nonparametric method of Kaplan-Meier and the gamma shared frailty model, under the survival analysis methodology. Survival and hazard rate curves associated with this event were estimated, and the influence of covariates on this time was identified. The mean and median times to first calving were 987.77 and 1,003 days, respectively, and the significant covariates by the Log-Rank test, through Kaplan-Meier analysis, were birth season, calving year, sire (cow's father) and calving season. In the analysis by the frailty model, the breeding values and the frailties of the sires for calving were predicted by modeling the risk function of each cow as a function of birth season as a fixed covariate and sire as a random covariate. The frailty followed the gamma distribution. Sires with high and positive breeding values possess high frailties, which means a shorter survival time of their daughters to the event, i.e., a reduction in their age at first calving. The second chapter aimed to evaluate the longevity of dairy cows using the nonparametric Kaplan-Meier method and the Cox and Weibull proportional hazards models. 10,000 records of the longevity trait of Brown-Swiss cows were simulated, involving their respective times until the occurrence of five consecutive calvings (the event), considered here as typical of a long-lived cow.
The covariates considered in the database were age at first calving, herd and sire (cow's father). All covariates influenced the longevity of cows by the Log-Rank and Wilcoxon tests. The mean and median times to the occurrence of the event were 2,436.285 and 2,437 days, respectively. Sires with higher breeding values also have a greater risk that their daughters reach the five consecutive calvings within 84 months.
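The Kaplan-Meier product-limit estimator used in both chapters can be sketched as follows (illustrative code with toy data, not the study's implementation):

```python
def kaplan_meier(times, events):
    # Kaplan-Meier product-limit estimator of the survival function.
    # times  : time to the event (e.g. calving) or to censoring
    # events : 1 if the event was observed, 0 if censored
    pairs = sorted(zip(times, events))
    event_times = sorted({t for t, e in pairs if e == 1})
    S, curve = 1.0, []
    for t in event_times:
        d = sum(1 for tt, e in pairs if tt == t and e == 1)  # events at t
        r = sum(1 for tt, _ in pairs if tt >= t)             # at risk at t
        S *= 1 - d / r
        curve.append((t, S))
    return curve

print(kaplan_meier([10, 20, 20, 30], [1, 1, 0, 1]))
```

Covariate effects such as birth season or sire are then screened by comparing these curves across groups with the Log-Rank or Wilcoxon test, as the abstract describes.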
Abstract:
This work proposes a methodology for the optimized allocation of switches for automatic load transfer in distribution systems, in order to improve the reliability indices by restoring such systems, which present voltage classes of 23 to 35 kV and radial topology. The automatic switches must be allocated on the system in order to transfer load remotely among the sources at the substations. The problem of switch allocation is formulated as a nonlinear constrained mixed-integer programming model subject to a set of economic and physical constraints. A dedicated Tabu Search (TS) algorithm is proposed to solve this model. The proposed methodology is tested on a large real-life distribution system. © 2011 IEEE.
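A minimal sketch of the kind of Tabu Search loop involved, assuming a caller-supplied cost function and a swap-move neighbourhood (all names and parameters are illustrative; the paper's economic and physical constraints are not reproduced here):

```python
import random

def tabu_search(cost, n_positions, n_switches, iters=200, tenure=5, seed=1):
    # Tabu Search over fixed-size subsets of candidate switch positions.
    # cost(placement) -> float evaluates a candidate set of positions;
    # a neighbour move swaps one selected position for an unselected one,
    # and a swapped-out position stays tabu for `tenure` iterations.
    rng = random.Random(seed)
    current = set(rng.sample(range(n_positions), n_switches))
    best, best_cost = set(current), cost(current)
    tabu = {}  # position -> iteration index until which it is tabu
    for it in range(iters):
        candidates = []
        for out in current:
            for inn in set(range(n_positions)) - current:
                if tabu.get(inn, -1) > it:
                    continue  # re-adding this position is forbidden
                cand = (current - {out}) | {inn}
                candidates.append((cost(cand), cand, out))
        if not candidates:
            continue
        c_cost, cand, moved_out = min(candidates, key=lambda x: x[0])
        current = cand                 # move even if worse (escapes optima)
        tabu[moved_out] = it + tenure
        if c_cost < best_cost:
            best, best_cost = set(cand), c_cost
    return best, best_cost
```

A toy usage: `cost = lambda s: sum(abs(p - t) for p, t in zip(sorted(s), [2, 5, 8]))` rewards placements near positions 2, 5 and 8; the real model would instead score reliability indices and switching costs on the feeder.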