952 results for log-ratio analysis
Abstract:
BACKGROUND Studies that systematically assess change in ulcerative colitis (UC) extent over time in adult patients are scarce. AIM To assess changes in disease extent over time and to evaluate clinical parameters associated with this change. METHODS Data from the Swiss IBD cohort study were analysed. We used logistic regression modelling to identify factors associated with a change in disease extent. RESULTS A total of 918 UC patients (45.3% females) were included. At diagnosis, UC patients presented with the following disease extent: proctitis [199 patients (21.7%)], left-sided colitis [338 patients (36.8%)] and extensive colitis/pancolitis [381 (41.5%)]. During a median disease duration of 9 [4-16] years, progression and regression were documented in 145 patients (15.8%) and 149 patients (16.2%), respectively. In addition, 624 patients (68.0%) had a stable disease extent. The following factors were identified as associated with disease progression: treatment with systemic glucocorticoids [odds ratio (OR) 1.704, P = 0.025] and calcineurin inhibitors (OR: 2.716, P = 0.005). No specific factors were found to be associated with disease regression. CONCLUSIONS Over a median disease duration of 9 [4-16] years, about two-thirds of UC patients maintained the initial disease extent; the remaining one-third experienced either progression or regression of the disease extent.
Abstract:
PURPOSE Lymphangioleiomyomatosis (LAM) is characterized by proliferation of smooth muscle tissue that causes bronchial obstruction and secondary cystic destruction of lung parenchyma. The aim of this study was to evaluate the typical distribution of cystic defects in LAM with quantitative volumetric chest computed tomography (CT). MATERIALS AND METHODS CT examinations of 20 patients with confirmed LAM were evaluated with region-based quantification of lung parenchyma. Additionally, 10 consecutive patients who had recently undergone CT imaging of the lung at our institution and showed no pulmonary pathology served as a control group. Each lung was divided into three regions (upper, middle and lower thirds) with an identical number of slices. In addition, we defined a "peel" and a "core" of the lung, comprising the 2 cm subpleural space and the remaining inner lung area, respectively. Computerized detection of lung volume and relative emphysema was performed with the PULMO 3D software (v3.42, Fraunhofer MEVIS, Bremen, Germany). This software package enables the quantification of emphysematous lung parenchyma by calculating the pixel index, which is defined as the ratio of lung voxels with a density < -950 HU to the total number of voxels in the lung. RESULTS Cystic changes accounted for 0.1-39.1% of the total lung volume in patients with LAM. Disease manifestation in the central lung was significantly higher than in peripheral areas (peel median: 15.1%, core median: 20.5%; p=0.001). The lower thirds of the lung parenchyma showed significantly fewer cystic changes than the upper and middle lung areas combined (lower third: median 13.4, upper and middle thirds: median 19.0; p=0.001). CONCLUSION The distribution of cystic lesions in LAM is significantly more pronounced in the central lung compared to peripheral areas. There is a significant predominance of cystic changes in the apical and intermediate lung zones compared to the lung bases.
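The pixel index defined above reduces to a simple voxel count; the sketch below assumes lung segmentation has already been done and works on a plain list of attenuation values (the segmentation step and the PULMO 3D internals are not reproduced here):

```python
def pixel_index(lung_hu, threshold=-950.0):
    """Fraction of segmented lung voxels with attenuation below `threshold` HU."""
    below = sum(1 for v in lung_hu if v < threshold)
    return below / len(lung_hu)

# toy example: 4 of 8 lung voxels fall below -950 HU
lung_hu = [-980, -960, -940, -900, -955, -951, -800, -500]
print(pixel_index(lung_hu))  # → 0.5
```

In practice the voxel values would come from a segmented CT volume rather than a hand-written list.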
Abstract:
OBJECTIVE To assess whether palliative primary tumor resection in colorectal cancer patients with incurable stage IV disease is associated with improved survival. BACKGROUND There is a heated debate regarding whether or not an asymptomatic primary tumor should be removed in patients with incurable stage IV colorectal disease. METHODS Stage IV colorectal cancer patients were identified in the Surveillance, Epidemiology, and End Results database between 1998 and 2009. Patients undergoing surgery to metastatic sites were excluded. Overall survival and cancer-specific survival were compared between patients with and without palliative primary tumor resection using risk-adjusted Cox proportional hazard regression models and stratified propensity score methods. RESULTS Overall, 37,793 stage IV colorectal cancer patients were identified. Of those, 23,004 (60.9%) underwent palliative primary tumor resection. The rate of patients undergoing palliative primary cancer resection decreased from 68.4% in 1998 to 50.7% in 2009 (P < 0.001). In Cox regression analysis after propensity score matching, primary cancer resection was associated with significantly improved overall survival [hazard ratio (HR) of death = 0.40, 95% confidence interval (CI) = 0.39-0.42, P < 0.001] and cancer-specific survival (HR of death = 0.39, 95% CI = 0.38-0.40, P < 0.001). The benefit of palliative primary cancer resection persisted during the period 1998 to 2009, with HRs equal to or less than 0.47 for both overall and cancer-specific survival. CONCLUSIONS On the basis of this population-based cohort of stage IV colorectal cancer patients, palliative primary tumor resection was associated with improved overall and cancer-specific survival. Therefore, the dogma that an asymptomatic primary tumor should never be resected in patients with unresectable colorectal cancer metastases must be questioned.
Abstract:
OBJECTIVES This study sought to evaluate: 1) the effect of impaired renal function on long-term clinical outcomes in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES); and 2) the safety and efficacy of new-generation compared with early-generation DES in women with chronic kidney disease (CKD). BACKGROUND The prevalence and effect of CKD in women undergoing PCI with DES is unclear. METHODS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by creatinine clearance (CrCl) <45 ml/min, 45 to 59 ml/min, and ≥60 ml/min. The primary endpoint was the 3-year rate of major adverse cardiovascular events (MACE). Participants for whom baseline creatinine was missing were excluded from the analysis. RESULTS Of 4,217 women included in the pooled cohort treated with DES and for whom serum creatinine was available, 603 (14%) had a CrCl <45 ml/min, 811 (19%) had a CrCl 45 to 59 ml/min, and 2,803 (66%) had a CrCl ≥60 ml/min. A significant stepwise gradient in risk for MACE was observed with worsening renal function (26.6% vs. 15.8% vs. 12.9%; p < 0.01). Following multivariable adjustment, CrCl <45 ml/min was independently associated with a higher risk of MACE (adjusted hazard ratio: 1.56; 95% confidence interval: 1.23 to 1.98) and all-cause mortality (adjusted hazard ratio: 2.67; 95% confidence interval: 1.85 to 3.85). Compared with early-generation DES, the use of new-generation DES was associated with a reduction in the risk of cardiac death, myocardial infarction, or stent thrombosis in women with CKD. The effect of new-generation DES on outcomes was uniform between women with and without CKD, without evidence of interaction. CONCLUSIONS Among women undergoing PCI with DES, CKD is a common comorbidity associated with a strong and independent risk for MACE that is durable over 3 years. The benefits of new-generation DES are uniform in women with and without CKD.
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors is unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized based on the presence or absence of high ATR, defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of the 10 449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had a significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In high-ATR women, the use of new-generation DES was associated with a significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P for interaction=0.14). In landmark analysis of high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years, stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Use of new-generation DES, even in women at high ATR, is associated with a benefit consistent over 3 years of follow-up and a substantial improvement in very-late thrombotic safety.
Abstract:
BACKGROUND Diabetes mellitus and angiographic coronary artery disease complexity are intertwined and unfavorably affect prognosis after percutaneous coronary interventions, but their relative impact on long-term outcomes after percutaneous coronary intervention with drug-eluting stents remains controversial. This study determined drug-eluting stent outcomes in relation to diabetic status and coronary artery disease complexity as assessed by the Synergy Between PCI With Taxus and Cardiac Surgery (SYNTAX) score. METHODS AND RESULTS In a patient-level pooled analysis from 4 all-comers trials, 6081 patients were stratified according to diabetic status and according to the median SYNTAX score ≤11 or >11. The primary end point was major adverse cardiac events, a composite of cardiac death, myocardial infarction, and clinically indicated target lesion revascularization within 2 years. Diabetes mellitus was present in 1310 patients (22%), and new-generation drug-eluting stents were used in 4554 patients (75%). Major adverse cardiac events occurred in 173 diabetic patients (14.5%) and 436 nondiabetic patients (9.9%; P<0.001). In adjusted Cox regression analyses, SYNTAX score and diabetes mellitus were both associated with the primary end point (P<0.001 and P=0.028, respectively; P for interaction, 0.07). In multivariable analyses, diabetic versus nondiabetic patients had higher risks of major adverse cardiac events (hazard ratio, 1.25; 95% confidence interval, 1.03-1.53; P=0.026) and target lesion revascularization (hazard ratio, 1.54; 95% confidence interval, 1.18-2.01; P=0.002) but similar risks of cardiac death (hazard ratio, 1.41; 95% confidence interval, 0.96-2.07; P=0.08) and myocardial infarction (hazard ratio, 0.89; 95% confidence interval, 0.64-1.22; P=0.45), without significant interaction with SYNTAX score ≤11 or >11 for any of the end points.
CONCLUSIONS In this population treated with predominantly new-generation drug-eluting stents, diabetic patients were at increased risk for repeat target-lesion revascularization consistently across the spectrum of disease complexity. The SYNTAX score was an independent predictor of 2-year outcomes but did not modify the respective effect of diabetes mellitus. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00297661, NCT00389220, NCT00617084, and NCT01443104.
Abstract:
BACKGROUND Survival and success rates of tooth transplantations have been shown to be very high, even after long follow-up periods. Nevertheless, it is important to analyse factors potentially influencing these rates. The aim of this study was to assess the influence of potential factors on success. METHODS The research was based on a retrospective analysis of clinical and radiological data from a sample of 59 subjects (75 transplanted teeth). The follow-up period varied from 0.44 to 12.28 years (mean 3.95 years). Success rates were calculated and depicted with Kaplan-Meier plots. Log-rank tests were used to analyse the effect of root development stage, apex width, the use of enamel matrix proteins or the surgeon on the success of transplantations. RESULTS Results for the success of premolar transplantations were comparable with previously published data, while molars performed worse than reported in other studies. The surgeon performing the transplantation (p = 0.001) and tooth type (p ≤ 0.001) were significantly associated with transplantation success. Use of enamel matrix proteins (p = 0.10), root development stage (p = 0.13), the recipient area (p = 0.48) and apex width (p = 0.59) were not significantly associated with success. CONCLUSIONS Molar transplantations were not as successful as premolar transplantations; however, success rates varied greatly depending on the surgeon's experience. The use of enamel matrix proteins as well as root development stage, the recipient area and apex width did not show significant associations with the success of tooth transplantations.
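The Kaplan-Meier (product-limit) estimator behind success curves like those above can be sketched in a few lines; this is a generic illustration on toy data, not the study's actual analysis:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates at each distinct event time.

    times  : follow-up time for each unit (e.g., a transplanted tooth)
    events : 1 if failure was observed, 0 if the observation was censored
    Returns a list of (time, survival) pairs.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        failures = sum(e for tt, e in data if tt == t)
        n_t = at_risk
        while i < len(data) and data[i][0] == t:  # consume all ties at time t
            i += 1
            at_risk -= 1
        if failures:
            survival *= 1 - failures / n_t
            curve.append((t, survival))
    return curve
```

Calling `kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])` yields survival 0.8, 0.6 and 0.3 at times 1, 2 and 3 (up to floating-point rounding); the standard convention is used that censored observations tied with a failure time are still at risk at that time.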
Abstract:
BACKGROUND Children born preterm or with a small size for gestational age are at increased risk for childhood asthma. OBJECTIVE We sought to assess the hypothesis that these associations are explained by reduced airway patency. METHODS We used individual participant data of 24,938 children from 24 birth cohorts to examine and meta-analyze the associations of gestational age, size for gestational age, and infant weight gain with childhood lung function and asthma (age range, 3.9-19.1 years). Second, we explored whether these lung function outcomes mediated the associations of early growth characteristics with childhood asthma. RESULTS Children born with a younger gestational age had a lower FEV1, FEV1/forced vital capacity (FVC) ratio, and forced expiratory flow after exhaling 75% of vital capacity (FEF75), whereas those born with a smaller size for gestational age had a lower FEV1 but higher FEV1/FVC ratio (P < .05). Greater infant weight gain was associated with higher FEV1 but lower FEV1/FVC ratio and FEF75 in childhood (P < .05). All associations were present across the full range and independent of other early-life growth characteristics. Preterm birth, low birth weight, and greater infant weight gain were associated with an increased risk of childhood asthma (pooled odds ratio, 1.34 [95% CI, 1.15-1.57], 1.32 [95% CI, 1.07-1.62], and 1.27 [95% CI, 1.21-1.34], respectively). Mediation analyses suggested that FEV1, FEV1/FVC ratio, and FEF75 might explain 7% (95% CI, 2% to 10%) to 45% (95% CI, 15% to 81%) of the associations between early growth characteristics and asthma. CONCLUSIONS Younger gestational age, smaller size for gestational age, and greater infant weight gain were associated with childhood lung function across their full ranges. These associations explain the risk of childhood asthma to a substantial extent.
Abstract:
In situ and simultaneous measurement of the three most abundant isotopologues of methane using mid-infrared laser absorption spectroscopy is demonstrated. A field-deployable, autonomous platform is realized by coupling a compact quantum cascade laser absorption spectrometer (QCLAS) to a preconcentration unit, called trace gas extractor (TREX). This unit enhances CH4 mole fractions by a factor of up to 500 above ambient levels and quantitatively separates interfering trace gases such as N2O and CO2. The analytical precision of the QCLAS isotope measurement on the preconcentrated (750 ppm, parts per million, µmol mol⁻¹) methane is 0.1 and 0.5 ‰ for δ13C- and δD-CH4, respectively, at 10 min averaging time. Based on repeated measurements of compressed air during a 2-week intercomparison campaign, the repeatability of the TREX–QCLAS was determined to be 0.19 and 1.9 ‰ for δ13C- and δD-CH4, respectively. In this intercomparison campaign the new in situ technique is compared to isotope-ratio mass spectrometry (IRMS) based on glass flask and bag sampling, and to real-time CH4 isotope analysis by two commercially available laser spectrometers. Both laser-based analyzers were limited to methane mole fraction and δ13C-CH4 analysis, and only one of them, a cavity ring-down spectrometer, was capable of delivering meaningful data for the isotopic composition. After correcting for scale offsets, the average differences between TREX–QCLAS data and bag/flask sampling–IRMS values are within the extended WMO compatibility goals of 0.2 and 5 ‰ for δ13C- and δD-CH4, respectively. This also demonstrates the potential to improve interlaboratory compatibility based on the analysis of a reference air sample with accurately determined isotopic composition.
Abstract:
BACKGROUND Non-steroidal anti-inflammatory drugs (NSAIDs) are the backbone of osteoarthritis pain management. We aimed to assess the effectiveness of different preparations and doses of NSAIDs on osteoarthritis pain in a network meta-analysis. METHODS For this network meta-analysis, we considered randomised trials comparing any of the following interventions: NSAIDs, paracetamol, or placebo, for the treatment of osteoarthritis pain. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) and the reference lists of relevant articles for trials published between Jan 1, 1980, and Feb 24, 2015, with at least 100 patients per group. The prespecified primary and secondary outcomes were pain and physical function, and were extracted in duplicate for up to seven timepoints after the start of treatment. We used an extension of multivariable Bayesian random-effects models for mixed multiple treatment comparisons with a random effect at the level of trials. For the primary analysis, a random walk of first order was used to account for multiple follow-up outcome data within a trial. Preparations that used different total daily doses were considered separately in the analysis. To assess a potential dose-response relation, we used preparation-specific covariates assuming linearity on log relative dose. FINDINGS We identified 8973 manuscripts from our search, from which 74 randomised trials with a total of 58 556 patients were included in this analysis. 23 nodes, comprising seven different NSAIDs or paracetamol at specific daily doses, or placebo, were considered. All preparations, irrespective of dose, improved point estimates of pain symptoms when compared with placebo.
For six interventions (diclofenac 150 mg/day; etoricoxib 30 mg/day, 60 mg/day, and 90 mg/day; and rofecoxib 25 mg/day and 50 mg/day), the probability that the difference from placebo is at or below a prespecified minimum clinically important effect for pain reduction (effect size [ES] -0·37) was at least 95%. Among maximally approved daily doses, diclofenac 150 mg/day (ES -0·57, 95% credibility interval [CrI] -0·69 to -0·46) and etoricoxib 60 mg/day (ES -0·58, -0·73 to -0·43) had the highest probability of being the best intervention, both with 100% probability to reach the minimum clinically important difference. Treatment effects increased as drug dose increased, but corresponding tests for a linear dose effect were significant only for celecoxib (p=0·030), diclofenac (p=0·031), and naproxen (p=0·026). We found no evidence that treatment effects varied over the duration of treatment. Model fit was good, and between-trial heterogeneity and inconsistency were low in all analyses. All trials were deemed to have a low risk of bias for blinding of patients. Effect estimates did not change in sensitivity analyses with two additional statistical models and accounting for methodological quality criteria in meta-regression analysis. INTERPRETATION On the basis of the available data, we see no role for single-agent paracetamol for the treatment of patients with osteoarthritis, irrespective of dose. We provide sound evidence that diclofenac 150 mg/day is the most effective NSAID available at present, in terms of improving both pain and function. Nevertheless, in view of the safety profile of these drugs, physicians need to consider our results together with all known safety information when selecting the preparation and dose for individual patients. FUNDING Swiss National Science Foundation (grant number 405340-104762) and Arco Foundation, Switzerland.
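The "linearity on log relative dose" assumption used above amounts to regressing effect size on the log of dose over a reference dose. A minimal least-squares sketch illustrates the idea on hypothetical inputs (the paper's actual model is Bayesian and far richer):

```python
import math

def log_dose_slope(doses, effect_sizes, reference_dose):
    """Ordinary least-squares slope of effect size on log relative dose."""
    x = [math.log(d / reference_dose) for d in doses]
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(effect_sizes) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, effect_sizes))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return sxy / sxx

# hypothetical effect sizes perfectly linear in log relative dose
doses = [50, 100, 150]
es = [0.2 * math.log(d / 150) for d in doses]
print(round(log_dose_slope(doses, es, 150), 3))  # → 0.2
```

A nonzero slope corresponds to the dose-response signal the tests for a linear dose effect are probing.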
Abstract:
BACKGROUND Evidence suggests that EMS-physician-guided cardiopulmonary resuscitation (CPR) in out-of-hospital cardiac arrest (OOHCA) may be associated with improved outcomes, yet randomized controlled trials are not available. The goal of this meta-analysis was to determine the association between EMS-physician- versus paramedic-guided CPR and survival after OOHCA. METHODS AND RESULTS Studies that compared EMS-physician- versus paramedic-guided CPR in OOHCA published until June 2014 were systematically searched in the MEDLINE, EMBASE and Cochrane databases. All studies were required to contain survival data. Data on study characteristics, methods, and survival outcomes were extracted. A random-effects model was used for the meta-analysis due to a high degree of heterogeneity among the studies (I² = 44%). Return of spontaneous circulation [ROSC], survival to hospital admission, and survival to hospital discharge were the outcome measures. Out of 3,385 potentially eligible studies, 14 met the inclusion criteria. In the pooled analysis (n = 126,829), EMS-physician-guided CPR was associated with significantly improved outcomes compared to paramedic-guided CPR: ROSC 36.2% (95% confidence interval [CI] 31.0-41.7%) vs. 23.4% (95% CI 18.5-29.2%) (pooled odds ratio [OR] 1.89, 95% CI 1.36-2.63, p < 0.001); survival to hospital admission 30.1% (95% CI 24.2-36.7%) vs. 19.2% (95% CI 12.7-28.1%) (pooled OR 1.78, 95% CI 0.97-3.28, p = 0.06); and survival to discharge 15.1% (95% CI 14.6-15.7%) vs. 8.4% (95% CI 8.2-8.5%) (pooled OR 2.03, 95% CI 1.48-2.79, p < 0.001). CONCLUSIONS This systematic review suggests that EMS-physician-guided CPR in out-of-hospital cardiac arrest is associated with improved survival outcomes.
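The pooled odds ratios above come from a random-effects model; a minimal DerSimonian-Laird sketch on 2x2 tables illustrates the mechanics (toy counts only, and the published analysis worked from study-level estimates, not raw tables):

```python
import math

def pooled_or_random_effects(tables):
    """DerSimonian-Laird random-effects pooled odds ratio.

    tables: list of 2x2 counts (a, b, c, d) =
            (treated events, treated non-events, control events, control non-events)
    """
    logs = [math.log((a * d) / (b * c)) for a, b, c, d in tables]
    variances = [1 / a + 1 / b + 1 / c + 1 / d for a, b, c, d in tables]
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, logs))
    # between-study variance (tau^2), truncated at zero
    tau2 = max(0.0, (q - (len(tables) - 1)) /
               (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    w_re = [1 / (v + tau2) for v in variances]
    return math.exp(sum(wi * yi for wi, yi in zip(w_re, logs)) / sum(w_re))

# two hypothetical studies with identical tables
print(round(pooled_or_random_effects([(10, 90, 5, 95), (10, 90, 5, 95)]), 3))  # → 2.111
```

With identical studies the heterogeneity statistic Q is zero, tau² truncates to zero, and the pooled OR equals each study's OR.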
Abstract:
Detecting lame cows is important for improving animal welfare. Automated tools are potentially useful for identifying and monitoring lame cows. The goal of this study was to evaluate the suitability of various physiological and behavioral parameters for automatically detecting lameness in dairy cows housed in a cubicle barn. Lame cows suffering from a claw horn lesion (sole ulcer or white line disease) of one claw of the same hind limb (n=32; group L) and 10 nonlame healthy cows (group C) were included in this study. Lying and standing behavior at night was assessed with three-dimensional accelerometers, weight distribution between the hind limbs with a 4-scale weighing platform, feeding behavior at night with a noseband sensor, and heart activity with the Polar device (Polar Electro Oy, Kempele, Finland). Either the entire data set or parts of the data collected over a 48-h period were used for statistical analysis, depending upon the parameter in question. The standing time at night over 12 h and the limb weight ratio (LWR) were significantly higher in group C than in group L, whereas the lying time at night over 12 h, the mean limb difference (Δweight), and the standard deviation (SD) of the weight applied to the limb taking less weight were significantly lower in group C than in group L. No significant difference was noted between the groups for the parameters of heart activity and feeding behavior at night. The locomotion score of cows in group L was positively correlated with lying time and Δweight, whereas it was negatively correlated with LWR and SD. The highest sensitivity (0.97) for lameness detection was found for the parameter SD [specificity of 0.80 and an area under the curve (AUC) of 0.84]. The highest specificity (0.90) for lameness detection was found for Δweight (sensitivity=0.78; AUC=0.88) and LWR (sensitivity=0.81; AUC=0.87).
The model combining the SD data with lying time at night was the best predictor of cows being lame, accounting for 40% of the variation in the likelihood of a cow being lame (sensitivity=0.94; specificity=0.80; AUC=0.86). In conclusion, the data derived from the 4-scale weighing platform, either alone or combined with the lying time at night over 12 h, represent the most valuable parameters for automated identification of lame cows suffering from a claw horn lesion of one hind limb.
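The AUC values reported above can be read as the probability that a randomly chosen lame cow scores higher on a detection parameter than a randomly chosen sound cow; a minimal sketch of that Mann-Whitney interpretation, using hypothetical scores:

```python
def auc(lame_scores, sound_scores):
    """AUC as the Mann-Whitney probability that a lame cow's score
    exceeds a sound cow's score; ties count one half."""
    wins = 0.0
    for lame in lame_scores:
        for sound in sound_scores:
            if lame > sound:
                wins += 1.0
            elif lame == sound:
                wins += 0.5
    return wins / (len(lame_scores) * len(sound_scores))

# hypothetical detection-parameter scores for 3 lame and 2 sound cows
print(round(auc([0.9, 0.8, 0.4], [0.3, 0.5]), 3))  # → 0.833
```

An AUC of 0.5 would mean the parameter ranks lame and sound cows no better than chance; 1.0 means perfect separation.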
Abstract:
Aims. The main goal of this work is to study element ratios that are important for the formation of planets of different masses. Methods. We study potential correlations between the existence of planetary companions and the relative elemental abundances of their host stars. We use a large sample of FGK-type dwarf stars for which precise Mg, Si, and Fe abundances have been derived using HARPS high-resolution, high-quality data. Results. A first analysis of the data suggests that low-mass planet host stars show higher [Mg/Si] ratios, while giant planet hosts present [Mg/Si] ratios lower than those of field stars. However, we found that the [Mg/Si] ratio depends significantly on metallicity through Galactic chemical evolution. After removing the Galactic evolution trend, only the difference in the [Mg/Si] elemental ratio between low-mass planet hosts and non-hosts remained significant. These results suggest that low-mass planets are more prevalent around stars with high [Mg/Si]. Conclusions. Our results demonstrate the importance of accounting for Galactic chemical evolution and indicate that the [Mg/Si] ratio may play an important role in planetary internal structure and composition. They also show that abundance ratios may be very relevant for our understanding of planet formation and evolution.
Abstract:
Currently there is no general method to study the impact of population admixture within families on the assumptions of random mating, and consequently on Hardy-Weinberg equilibrium (HWE) and linkage equilibrium (LE), or on the inference obtained from traditional linkage analysis. First, through simulation, the effect of admixture of two populations on the log of the odds (LOD) score was assessed, using prostate cancer as the disease model. Comparisons between simulated mixed and homogeneous families were performed. LOD scores under both models of admixture (within families and within a data set of homogeneous families) were closest to the homogeneous-family scores of the population with the highest mixing proportion. Random sampling of families or ascertainment of families by disease affection status did not affect this observation, nor did the mode of inheritance (dominant/recessive) or sample size. Second, after establishing the effect of admixture on the LOD score and the inference for linkage, the presence of disequilibria induced by population admixture within families was studied and an adjustment procedure was developed. The adjustment did not force all disequilibria to disappear, but because the families were adjusted for population admixture, replicates in which disequilibria existed were no longer affected by them when maximizing for linkage. Furthermore, the adjustment was able to exclude uninformative families or families with such a high departure from HWE and/or LE that their LOD scores were not reliable. Together these observations imply that the presence of families of mixed population ancestry affects linkage analysis in terms of both the LOD score and the estimate of the recombination fraction.
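The LOD score being maximized above is the log10 ratio of the likelihood at a recombination fraction theta to the likelihood at theta = 0.5 (no linkage). For phase-known meioses this is a one-line formula; the sketch below is an illustration only, since the simulations described used full pedigree likelihoods:

```python
import math

def lod(recombinants, nonrecombinants, theta):
    """Two-point LOD score for phase-known meioses:
    log10 L(theta) - log10 L(theta = 0.5), with 0 < theta <= 0.5."""
    n = recombinants + nonrecombinants
    return (recombinants * math.log10(theta)
            + nonrecombinants * math.log10(1 - theta)
            - n * math.log10(0.5))

# 0 recombinants out of 10 informative meioses, evaluated at theta = 0.1
print(round(lod(0, 10, 0.1), 3))  # → 2.553
```

In practice the LOD curve is evaluated over a grid of theta values and the maximum is reported; a maximum above 3 is the conventional evidence threshold for linkage.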
Abstract:
Do siblings of centenarians tend to have longer life spans? To answer this question, the life spans of 184 siblings of 42 centenarians were evaluated. Two important questions were addressed in analyzing the sibling data. First, a standard needs to be established against which the life spans of the 184 siblings are compared. In this report, an external reference population is constructed from the U.S. life tables. Its estimated mortality rates are treated as baseline hazards from which the relative mortality of the siblings is estimated. Second, standard survival models, which assume independent observations, are invalid when correlation within families exists, underestimating the true variance. Three different methods that allow for correlation are illustrated. First, the cumulative relative excess mortality between the siblings and their comparison group is calculated and used as an effective graphical tool, along with the product-limit estimator of the survival function. The variance estimator of the cumulative relative excess mortality is adjusted for potential within-family correlation using a Taylor linearization approach. Second, approaches that adjust for the inflated variance are examined: an adjusted one-sample log-rank test using the design effect originally proposed by Rao and Scott in the correlated binomial or Poisson setting, and a robust variance estimator derived from the log-likelihood function of a multiplicative model. Neither of these two approaches provides a correlation estimate within families, but the comparison with the standard remains valid under dependence. Last, using the frailty-model concept, the multiplicative model, where the baseline hazards are known, is extended by adding a random frailty term based on the positive stable or the gamma distribution. Comparisons between the two frailty distributions are performed by simulation.
Based on the results from these various approaches, it is concluded that the siblings of centenarians had significantly lower mortality rates than their birth cohorts. The frailty models also indicate significant correlations between the life spans of the siblings.
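The relative-mortality comparison against an external life table reduces to an observed-over-expected ratio, with expected deaths accumulated from reference rates and person-years at risk. A minimal sketch (the age bands and rates below are made up for illustration):

```python
def relative_mortality(observed_deaths, person_years, reference_rates):
    """Observed/expected mortality ratio against external life-table rates.

    person_years    : {age_band: person-years contributed by the siblings}
    reference_rates : {age_band: reference deaths per person-year}
    """
    expected = sum(py * reference_rates[band] for band, py in person_years.items())
    return observed_deaths / expected

# hypothetical: 4 deaths observed where the life table predicts 5
py = {"70-79": 100, "80-89": 50}
rates = {"70-79": 0.02, "80-89": 0.06}
print(round(relative_mortality(4, py, rates), 3))  # → 0.8
```

A ratio below 1, as found for the centenarians' siblings, indicates lower mortality than the reference population; the variance of this ratio would still need the within-family adjustments described above.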