972 results for LIFETIME DATA
Abstract:
In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic model of radiation carcinogenesis: latent time distributions and their properties. Math Biosci 1993; 113: 51-75], and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de São Carlos, São Carlos-SP, Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of tumour cells after an initial treatment or the capacity of an individual exposed to irradiation to repair altered cells that result in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the repair system of an individual. Markov Chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. Also, some discussions on model selection and an illustration with a cutaneous melanoma data set analysed by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de São Carlos, São Carlos-SP, Brazil, 2009 (accepted in Lifetime Data Analysis)] are presented.
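For orientation, the promotion time cure model that the proposed model generalises (in terms of dispersion) can be written through the probability generating function of the latent count; in standard notation, not specific to this article, with N the latent number of lesions or altered cells, S the survival function of their activation times and A_N the probability generating function of N,

\[
S_{\mathrm{pop}}(t) = E\bigl[S(t)^{N}\bigr] = A_N\bigl(S(t)\bigr),
\qquad
p_0 = \lim_{t\to\infty} S_{\mathrm{pop}}(t) = A_N(0) = P(N = 0),
\]

and for \(N \sim \mathrm{Poisson}(\theta)\) this reduces to the promotion time form \(S_{\mathrm{pop}}(t) = \exp\{-\theta F(t)\}\) with cure fraction \(e^{-\theta}\). Replacing the Poisson law by a compound weighted Poisson law relaxes the equidispersion built into this special case.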
Abstract:
PURPOSE The purpose of this study was to describe autofluorescence lifetime characteristics in Stargardt disease (STGD) using fluorescence lifetime imaging ophthalmoscopy (FLIO) and to investigate potential prognostic markers for disease activity and progression. METHODS Fluorescence lifetime data of 16 patients with STGD (mean age, 40 years; range, 22-56 years) and 15 age-matched controls were acquired using a fluorescence lifetime imaging ophthalmoscope based on a Heidelberg Engineering Spectralis system. Autofluorescence was excited with a 473-nm laser, and decay times were measured in a short (498-560 nm) and long (560-720 nm) spectral channel. Clinical features, autofluorescence lifetimes and intensity, and corresponding optical coherence tomography images were analyzed. One-year follow-up examination was performed in eight STGD patients. Acquired data were correlated with in vitro measured decay times of all-trans retinal and N-retinylidene-N-retinylethanolamine. RESULTS Patients with STGD displayed characteristic autofluorescence lifetimes within yellow flecks (446 ps) compared with 297 ps in unaffected areas. In 15% of the STGD eyes, some flecks showed very short fluorescence lifetimes (242 ps). Atrophic areas were characterized by long lifetimes (474 ps), with some remaining areas of normal to short lifetimes (322 ps) toward the macular center. CONCLUSIONS Patients with recent disease onset showed flecks with very short autofluorescence lifetimes, which is possible evidence of accumulation of retinoids deriving from the visual cycle. During the study period, many of these flecks changed to longer lifetimes, possibly due to accumulation of lipofuscin. Therefore, FLIO might serve as a useful tool for monitoring of disease progression. (ClinicalTrials.gov number, NCT01981148.).
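As a minimal illustration of how a fluorescence decay time can be extracted from photon-count data, the sketch below fits a single-exponential decay to synthetic counts; actual FLIO analysis typically uses multi-exponential fits and amplitude-weighted mean lifetimes, and the function names and parameter values here are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, amplitude, tau, offset):
    """Single-exponential fluorescence decay model (tau in the same time units as t)."""
    return amplitude * np.exp(-t / tau) + offset

# Synthetic decay histogram: a 300 ps lifetime sampled over 12.5 ns with Poisson noise
t = np.linspace(0.0, 12_500.0, 256)                      # time bins in picoseconds
rng = np.random.default_rng(0)
counts = rng.poisson(mono_exp(t, amplitude=1000.0, tau=300.0, offset=5.0))

# Least-squares fit; p0 gives rough starting values for amplitude, tau (ps) and offset
params, _ = curve_fit(mono_exp, t, counts, p0=(counts.max(), 500.0, 1.0))
print(f"estimated lifetime: {params[1]:.0f} ps")
```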
Abstract:
The purpose of this study was to correct some mistakes in the literature and to derive a necessary and sufficient condition for the MRL to follow the roller-coaster pattern of the corresponding failure rate function. It was also desired to find the conditions under which the discrete failure rate function has an upside-down bathtub shape when the corresponding MRL function has a bathtub shape. The study showed that if the discrete MRL has a bathtub shape, then under some conditions the corresponding failure rate function has an upside-down bathtub shape. The study also corrected some mistakes in the proofs of Tang, Lu and Chew (1999) and established the necessary and sufficient condition mentioned above for the roller-coaster pattern. Similarly, some mistakes in Gupta and Gupta (2000) were corrected, and the ensuing results were expanded and proved thoroughly to establish the relationship between the crossing points of the failure rate and associated MRL functions. The new results derived in this study will be useful for modelling various lifetime data that occur in environmental studies, medical research, electronics engineering, and many other areas of science and technology.
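For reference, in the continuous case the MRL and the failure rate are linked by a standard identity (the discrete analogue treated in this study requires more care):

\[
m(t) = E[T - t \mid T > t] = \frac{1}{S(t)}\int_t^{\infty} S(u)\,du,
\qquad
h(t) = \frac{1 + m'(t)}{m(t)},
\]

which is the continuous-time counterpart of the bathtub/upside-down-bathtub duality between the MRL and the failure rate discussed above.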
Abstract:
Because of their extensive applications, lifetime distributions with bathtub-shaped failure rate functions have received considerable attention in the modelling and analysis of lifetime data. The purpose of this thesis was to introduce a new class of bivariate lifetime distributions with bathtub-shaped failure rates (BTFRFs). In this research, we first reviewed univariate lifetime distributions with bathtub-shaped failure rates and several multivariate extensions of a univariate failure rate function. We then introduced a new class of bivariate distributions with bathtub-shaped failure rates (hazard gradients). Specifically, the new class of bivariate lifetime distributions was developed using Morgenstern’s method of defining a bivariate class of distributions with given marginals. Computer simulations and numerical computations were used to investigate the properties of these distributions.
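Morgenstern’s construction referred to above builds a bivariate distribution from given marginals \(F_1\) and \(F_2\); in its standard (Farlie-Gumbel-Morgenstern) form, not necessarily the thesis’s notation,

\[
F(x, y) = F_1(x)\,F_2(y)\bigl[1 + \alpha\{1 - F_1(x)\}\{1 - F_2(y)\}\bigr],
\qquad -1 \le \alpha \le 1,
\]

so choosing marginals with bathtub-shaped failure rates and examining the resulting hazard gradients yields a bivariate BTFRF family.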
Abstract:
Hazard and reliability prediction of an engineering asset is one of the significant fields of research in Engineering Asset Health Management (EAHM). In real-life situations, where an engineering asset operates under dynamic operational and environmental conditions, its lifetime can be influenced and/or indicated by different factors that are termed covariates. The Explicit Hazard Model (EHM) is a new covariate-based approach to hazard prediction which explicitly incorporates both internal and external covariates into one model, and it is well suited to the analysis of lifetime data in the presence of both types of covariate in the reliability field. This paper presents applications of the methodology introduced and illustrated in the theory part of this study: the semi-parametric EHM is applied to a case study in order to predict the hazard and reliability of resistance elements on a Resistance Corrosion Sensor Board (RCSB).
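For orientation only (this is not the EHM itself), the most familiar covariate-based hazard model is the proportional hazards form

\[
h(t \mid \mathbf{z}) = h_0(t)\,\exp\{\boldsymbol{\beta}^{\top}\mathbf{z}(t)\},
\]

in which a baseline hazard \(h_0\) is modified by (possibly time-varying) covariates \(\mathbf{z}(t)\); the EHM differs in that it incorporates internal and external covariates explicitly in one model, as described above.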
Abstract:
Background Exercise referral schemes (ERS) aim to identify inactive adults in the primary care setting. The primary care professional refers the patient to a third party service, with this service taking responsibility for prescribing and monitoring an exercise programme tailored to the needs of the patient. This paper examines the cost-effectiveness of ERS in promoting physical activity compared with usual care in the primary care setting. Methods A decision analytic model was developed to estimate the cost-effectiveness of ERS from a UK NHS perspective. The costs and outcomes of ERS were modelled over the patient's lifetime. Data were derived from a systematic review of the literature on the clinical and cost-effectiveness of ERS, and on parameter inputs in the modelling framework. Outcomes were expressed as incremental cost per quality-adjusted life-year (QALY). Deterministic and probabilistic sensitivity analyses investigated the impact of varying ERS cost and effectiveness assumptions. Sub-group analyses explored the cost-effectiveness of ERS in sedentary people with an underlying condition. Results Compared with usual care, the mean incremental lifetime cost per patient for ERS was £169 and the mean incremental QALY was 0.008, generating a base-case incremental cost-effectiveness ratio (ICER) for ERS of £20,876 per QALY in sedentary individuals without a diagnosed medical condition. There was a 51% probability that ERS was cost-effective at £20,000 per QALY and an 88% probability that ERS was cost-effective at £30,000 per QALY. In sub-group analyses, the cost per QALY for ERS in sedentary obese individuals was £14,618, and in sedentary hypertensives and sedentary individuals with depression the estimated cost per QALY was £12,834 and £8,414 respectively. Incremental lifetime costs and benefits associated with ERS were small, reflecting the preventative public health context of the intervention, with the result that estimates of cost-effectiveness are sensitive to variations in the relative risk of becoming physically active and the cost of ERS. Conclusions ERS is associated with a modest increase in lifetime costs and benefits. The cost-effectiveness of ERS is highly sensitive to small changes in the effectiveness and cost of ERS and is subject to some significant uncertainty, mainly due to limitations in the clinical effectiveness evidence base.
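The base-case ICER quoted above is the ratio of the incremental lifetime cost to the incremental QALY gain; with the rounded figures reported here the arithmetic is approximately

\[
\mathrm{ICER} = \frac{\Delta\,\text{cost}}{\Delta\,\text{QALY}} = \frac{£169}{0.008} \approx £21{,}125 \text{ per QALY},
\]

consistent with the reported £20,876 per QALY once the unrounded cost and QALY increments are used.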
Abstract:
The recombination properties of cobalt centers in p-type germanium containing cobalt in the concentration range 10^14 to 10^16 atoms/cm^3 have been investigated. The measurement of lifetime has been carried out by steady-state photoconductivity and photo-magneto-electric methods in the temperature range 145 to 300 K. The cross-sections S_n^0 (electron capture cross-section at neutral centers) and S_n^- (electron capture cross-section at singly negatively charged centers) and their temperature variations have been estimated by analysis of the lifetime data on the basis of Sah-Shockley's multi-level formula. The value of S_n^0 is (15 ± 5) × 10^-16 cm^2 and is temperature independent. The value of S_n^- is ≈ 4 × 10^-16 cm^2 around 225 K and increases with increasing temperature. The possible mechanisms for capture at neutral and repulsive centers are discussed, and a summary of the capture cross-sections for cobalt centers is given. A comparison of the cross-section values for cobalt and their temperature variations with those of the related impurities (manganese, iron and nickel) in germanium has been made.
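For context, in the simplest single-level (Shockley-Read-Hall type) picture the measured lifetime is related to the capture cross-section through the carrier thermal velocity and the centre concentration,

\[
\frac{1}{\tau_n} = c_n N_t = \sigma_n v_{\mathrm{th}} N_t,
\]

where \(\sigma_n\) is the electron capture cross-section, \(v_{\mathrm{th}}\) the electron thermal velocity and \(N_t\) the concentration of recombination centres; the Sah-Shockley multi-level formula used in the paper generalises this to centres with several charge states.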
Abstract:
We present a robust Dirichlet process for estimating survival functions from samples with right-censored data. It adopts a prior near-ignorance approach to avoid almost any assumption about the distribution of the population lifetimes, as well as the need to elicit an infinite-dimensional parameter (in case of lack of prior information), as happens with the usual Dirichlet process prior. We show how such a model can be used to derive robust inferences from right-censored lifetime data. Robustness is due to the identification of the decisions that are prior-dependent, and can be interpreted as an analysis of sensitivity with respect to the hypothetical inclusion of fictitious new samples in the data. In particular, we derive a nonparametric estimator of the survival probability and a hypothesis test about the probability that the lifetime of an individual from one population is shorter than the lifetime of an individual from another. We evaluate these ideas on simulated data and on the Australian AIDS survival dataset. The methods are publicly available through an easy-to-use R package.
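As a point of comparison for the nonparametric survival estimator mentioned above, the sketch below computes the classical Kaplan-Meier estimate for right-censored data; it is not the robust Dirichlet-process estimator of the paper (whose implementation is in the R package referred to), just the standard frequentist baseline.

```python
import numpy as np

def kaplan_meier(times, events):
    """Classical Kaplan-Meier estimator for right-censored data.

    times  : observed times (event or censoring)
    events : indicators, 1 = event observed, 0 = right-censored
    Returns the distinct event times and the estimated survival just after each.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)

    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(times[events == 1]):           # distinct event times, ascending
        at_risk = np.sum(times >= t)                  # subjects still at risk just before t
        deaths = np.sum((times == t) & (events == 1))
        surv *= 1.0 - deaths / at_risk                # multiply in the conditional survival
        out_t.append(t)
        out_s.append(surv)
    return np.array(out_t), np.array(out_s)

# Small right-censored example (event indicator 0 marks a censored observation)
t, s = kaplan_meier([6, 7, 10, 15, 19, 25], [1, 0, 1, 1, 0, 1])
print(dict(zip(t.tolist(), np.round(s, 3).tolist())))
```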
Abstract:
In the present environment, industry should provide products of high quality. The quality of a product is judged by the period of time over which it can successfully perform its intended function without failure. The causes of failure can be ascertained through life testing experiments, and the times to failure due to different causes are likely to follow different distributions. Knowledge of these distributions is essential to eliminate causes of failure and thereby to improve the quality and reliability of products. The main accomplishment expected from this study is the development of statistical tools that could facilitate solutions to lifetime data problems arising in such and similar contexts.
Abstract:
So far, in the bivariate setup, the analysis of lifetime (failure time) data with multiple causes of failure is done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is more important for the partners’ risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. Design of high-reliability systems generally requires that the individual system components have extremely high reliability even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
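The bivariate vector hazard rate (hazard gradient) of Johnson and Kotz (1975) mentioned above is, in its usual form, the negative gradient of the log joint survival function \(\bar{F}(t_1, t_2) = P(T_1 > t_1, T_2 > t_2)\):

\[
\mathbf{h}(t_1, t_2) = -\nabla \log \bar{F}(t_1, t_2)
= \left(-\frac{\partial}{\partial t_1}\log \bar{F}(t_1, t_2),\; -\frac{\partial}{\partial t_2}\log \bar{F}(t_1, t_2)\right),
\]

each component being the conditional failure rate of one lifetime given that both units have survived to \((t_1, t_2)\).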
Abstract:
Partial moments are extensively used in the literature for the modeling and analysis of lifetime data. In this paper, we study properties of partial moments using quantile functions. The quantile-based measure determines the underlying distribution uniquely. We then characterize certain lifetime quantile function models. The proposed measure provides alternative definitions for ageing criteria. Finally, we explore the utility of the measure in comparing the characteristics of two lifetime distributions.
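As a concrete instance (in generic notation, not necessarily the paper's), for a continuous lifetime X with distribution function F and quantile function Q, the r-th upper partial moment about a threshold t can be rewritten in quantile terms via the substitution x = Q(u):

\[
\alpha_r(t) = E\bigl[(X - t)_+^{\,r}\bigr] = \int_t^{\infty}(x - t)^r\,dF(x)
= \int_{F(t)}^{1}\bigl(Q(u) - t\bigr)^r\,du,
\]

and taking \(t = Q(u_0)\) gives a purely quantile-based form \(\int_{u_0}^{1}\{Q(u) - Q(u_0)\}^r\,du\) of the kind studied above.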
Abstract:
In this paper, we develop a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows a compound weighted Poisson distribution. This model is more flexible in terms of dispersion than the promotion time cure model. Moreover, it gives an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of the initial risk factors in a competitive scenario. In other words, what is recorded is only the undamaged portion of the original number of risk factors.
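One standard way to formalise such a destructive mechanism (written here in generic notation) is binomial thinning of the initial count: if M risk factors are initiated and each independently survives the destructive process with probability p, then the number actually recorded is

\[
D \mid M = m \;\sim\; \mathrm{Binomial}(m, p), \qquad M \sim \text{compound weighted Poisson},
\]

the cure fraction is \(P(D = 0)\), and the observable failure time is the minimum activation time among the D undamaged risk factors.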
Abstract:
In this paper, the generalized log-gamma regression model is modified to allow the possibility that long-term survivors may be present in the data. This modification leads to a generalized log-gamma regression model with a cure rate, encompassing, as special cases, the log-exponential, log-Weibull and log-normal regression models with a cure rate typically used to model such data. The models attempt to simultaneously estimate the effects of explanatory variables on the timing acceleration/deceleration of a given event and the surviving fraction, that is, the proportion of the population for which the event never occurs. The normal curvatures of local influence are derived under some usual perturbation schemes and two martingale-type residuals are proposed to assess departures from the generalized log-gamma error assumption as well as to detect outlying observations. Finally, a data set from the medical area is analyzed.
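Schematically, and in generic notation rather than the exact parametrisation of the paper, such models combine a mixture cure structure with a log-location-scale regression for the susceptible individuals:

\[
S_{\mathrm{pop}}(t \mid \mathbf{x}) = \pi(\mathbf{x}) + \{1 - \pi(\mathbf{x})\}\,S(t \mid \mathbf{x}),
\qquad
\log T = \mathbf{x}^{\top}\boldsymbol{\beta} + \sigma\,\varepsilon,
\]

where \(\pi(\mathbf{x})\) is the surviving fraction and \(\varepsilon\) follows the generalized log-gamma error law, which contains the log-exponential, log-Weibull and log-normal special cases noted above.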
Abstract:
The generalized Birnbaum-Saunders distribution pertains to a class of lifetime models including both lighter and heavier tailed distributions. This model adapts well to lifetime data, even when outliers exist, and has other good theoretical properties and application perspectives. However, statistical inference tools may not exist in closed form for this model. Hence, simulation and numerical studies are needed, which require a random number generator. Three different ways to generate observations from this model are considered here. These generators are compared by means of a goodness-of-fit procedure and by their effectiveness in recovering the true parameter values in Monte Carlo simulations. This goodness-of-fit procedure may also be used as an estimation method. The quality of this estimation method is studied here. Finally, through a real data set, the generalized and classical Birnbaum-Saunders models are compared by using this estimation method.
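For the classical Birnbaum-Saunders case, the simplest generation route uses its monotone relationship with a standard normal variate; a minimal sketch is given below (the generalized version studied in the paper replaces the normal kernel by other symmetric laws, which is not implemented here).

```python
import numpy as np

def rbirnbaum_saunders(n, alpha, beta, rng=None):
    """Draw n classical Birnbaum-Saunders(alpha, beta) variates via the normal representation.

    If Z ~ N(0, 1), then T = beta * (alpha*Z/2 + sqrt((alpha*Z/2)**2 + 1))**2 follows BS(alpha, beta).
    """
    rng = np.random.default_rng(rng)
    z = rng.standard_normal(n)
    w = alpha * z / 2.0
    return beta * (w + np.sqrt(w**2 + 1.0)) ** 2

# Quick sanity check against the known mean, E[T] = beta * (1 + alpha**2 / 2)
sample = rbirnbaum_saunders(200_000, alpha=0.5, beta=2.0, rng=1)
print(sample.mean(), 2.0 * (1 + 0.5**2 / 2))
```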
Abstract:
In this article, we compare three residuals based on the deviance component in generalised log-gamma regression models with censored observations. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. For all cases studied, the empirical distributions of the proposed residuals are in general symmetric around zero, but only a martingale-type residual showed negligible kurtosis in the majority of the cases studied. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the martingale-type residual in generalised log-gamma regression models with censored data. A lifetime data set is analysed under log-gamma regression models and model checking based on the martingale-type residual is performed.
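The residuals compared above are, in their usual censored-data form (generic notation, with \(\delta_i\) the event indicator and \(\hat{\Lambda}\) the fitted cumulative hazard),

\[
r_{M_i} = \delta_i - \hat{\Lambda}(t_i) = \delta_i + \log \hat{S}(t_i),
\qquad
r_{D_i} = \operatorname{sign}(r_{M_i})\Bigl\{-2\bigl[r_{M_i} + \delta_i \log(\delta_i - r_{M_i})\bigr]\Bigr\}^{1/2},
\]

so departures of their empirical distributions from the standard normal signal a misspecified generalised log-gamma error or outlying observations.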