68 results for Automated Hazard Analysis
Abstract:
IMPORTANCE Some experts suggest that serum thyrotropin levels in the upper part of the current reference range should be considered abnormal, an approach that would reclassify many individuals as having mild hypothyroidism. Health hazards associated with such thyrotropin levels are poorly documented, but conflicting evidence suggests that thyrotropin levels in the upper part of the reference range may be associated with an increased risk of coronary heart disease (CHD). OBJECTIVE To assess the association between differences in thyroid function within the reference range and CHD risk. DESIGN, SETTING, AND PARTICIPANTS Individual participant data analysis of 14 cohorts with baseline examinations between July 1972 and April 2002 and with median follow-up ranging from 3.3 to 20.0 years. Participants included 55,412 individuals with serum thyrotropin levels of 0.45 to 4.49 mIU/L and no previously known thyroid or cardiovascular disease at baseline. EXPOSURES Thyroid function as expressed by serum thyrotropin levels at baseline. MAIN OUTCOMES AND MEASURES Hazard ratios (HRs) of CHD mortality and CHD events according to thyrotropin levels after adjustment for age, sex, and smoking status. RESULTS Among 55,412 individuals, 1813 people (3.3%) died of CHD during 643,183 person-years of follow-up. In 10 cohorts with information on both nonfatal and fatal CHD events, 4666 of 48,875 individuals (9.5%) experienced a first-time CHD event during 533,408 person-years of follow-up. For each 1-mIU/L higher thyrotropin level, the HR was 0.97 (95% CI, 0.90-1.04) for CHD mortality and 1.00 (95% CI, 0.97-1.03) for a first-time CHD event. Likewise, in analyses by category of thyrotropin level, the HRs of CHD mortality (0.94 [95% CI, 0.74-1.20]) and of CHD events (0.97 [95% CI, 0.83-1.13]) were similar among participants with the highest (3.50-4.49 mIU/L) and the lowest (0.45-1.49 mIU/L) thyrotropin levels. Subgroup analyses by sex and age group yielded similar results. CONCLUSIONS AND RELEVANCE Thyrotropin levels within the reference range are not associated with risk of CHD events or CHD mortality. This finding suggests that differences in thyroid function within the population reference range do not influence the risk of CHD. Increased CHD risk does not appear to be a reason for lowering the upper thyrotropin reference limit.
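Hazard ratios like those above come from Cox proportional hazards models with thyrotropin as a continuous covariate, adjusted for age, sex, and smoking status. A minimal sketch of that modelling step in Python with the lifelines package, using simulated data and hypothetical column names (not the study's code or data):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "tsh_miu_l": rng.uniform(0.45, 4.49, n),  # baseline TSH within the reference range
    "age": rng.integers(40, 80, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})
df["followup_years"] = rng.exponential(15, n).clip(0.5, 20.0)  # simulated follow-up
df["chd_death"] = rng.integers(0, 2, n)                        # placeholder CHD-death flag

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="chd_death")
# exp(coef) for tsh_miu_l is the HR per 1-mIU/L higher thyrotropin,
# the same quantity as the reported 0.97 (95% CI, 0.90-1.04).
cph.print_summary()
```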
Abstract:
OBJECTIVE The objective was to determine the risk of stroke associated with subclinical hypothyroidism. DATA SOURCES AND STUDY SELECTION Published prospective cohort studies were identified by a systematic search of several databases through November 2013, without restrictions. Unpublished studies were identified through the Thyroid Studies Collaboration. We collected individual participant data on thyroid function and stroke outcome. Euthyroidism was defined as TSH levels of 0.45-4.49 mIU/L, and subclinical hypothyroidism was defined as TSH levels of 4.5-19.9 mIU/L with normal T4 levels. DATA EXTRACTION AND SYNTHESIS We collected individual participant data on 47,573 adults (3451 with subclinical hypothyroidism) from 17 cohorts followed up between 1972 and 2014 (489,192 person-years). Age- and sex-adjusted pooled hazard ratios (HRs) for participants with subclinical hypothyroidism compared with euthyroidism were 1.05 (95% confidence interval [CI], 0.91-1.21) for stroke events (combined fatal and nonfatal stroke) and 1.07 (95% CI, 0.80-1.42) for fatal stroke. Stratified by age, the HR for stroke events was 3.32 (95% CI, 1.25-8.80) for individuals aged 18-49 years. There was an increased risk of fatal stroke in the age groups 18-49 and 50-64 years, with HRs of 4.22 (95% CI, 1.08-16.55) and 2.86 (95% CI, 1.31-6.26), respectively (P for trend = 0.04). We found no increased risk for those 65-79 years old (HR, 1.00; 95% CI, 0.86-1.18) or ≥80 years old (HR, 1.31; 95% CI, 0.79-2.18). There was a pattern of increased risk of fatal stroke with higher TSH concentrations. CONCLUSIONS Although no overall effect of subclinical hypothyroidism on stroke could be demonstrated, an increased risk was observed in subjects younger than 65 years and in those with higher TSH concentrations.
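The pooled HRs here combine per-cohort estimates from individual participant data. The abstract does not spell out the pooling model; one common approach is fixed-effect (inverse-variance) pooling of per-cohort log-HRs, sketched below with invented example numbers:

```python
import numpy as np

def pool_hrs(hrs, ci_lows, ci_highs):
    """Inverse-variance (fixed-effect) pooling of log-HRs; CIs assumed to be 95%."""
    log_hr = np.log(hrs)
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)  # SE recovered from the CI
    w = 1.0 / se**2                                         # inverse-variance weights
    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    return np.exp(pooled), np.exp(lo), np.exp(hi)

# Three hypothetical cohorts (not the study's data):
print(pool_hrs(np.array([1.10, 0.95, 1.20]),
               np.array([0.80, 0.70, 0.85]),
               np.array([1.51, 1.29, 1.69])))
```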
Abstract:
OBJECTIVE To assess whether palliative primary tumor resection in colorectal cancer patients with incurable stage IV disease is associated with improved survival. BACKGROUND There is a heated debate regarding whether an asymptomatic primary tumor should be removed in patients with incurable stage IV colorectal disease. METHODS Stage IV colorectal cancer patients were identified in the Surveillance, Epidemiology, and End Results database between 1998 and 2009. Patients undergoing surgery to metastatic sites were excluded. Overall survival and cancer-specific survival were compared between patients with and without palliative primary tumor resection using risk-adjusted Cox proportional hazard regression models and stratified propensity score methods. RESULTS Overall, 37,793 stage IV colorectal cancer patients were identified. Of those, 23,004 (60.9%) underwent palliative primary tumor resection. The rate of patients undergoing palliative primary cancer resection decreased from 68.4% in 1998 to 50.7% in 2009 (P < 0.001). In Cox regression analysis after propensity score matching, primary cancer resection was associated with significantly improved overall survival [hazard ratio (HR) of death = 0.40, 95% confidence interval (CI) = 0.39-0.42, P < 0.001] and cancer-specific survival (HR of death = 0.39, 95% CI = 0.38-0.40, P < 0.001). The benefit of palliative primary cancer resection persisted throughout the period 1998 to 2009, with HRs equal to or less than 0.47 for both overall and cancer-specific survival. CONCLUSIONS On the basis of this population-based cohort of stage IV colorectal cancer patients, palliative primary tumor resection was associated with improved overall and cancer-specific survival. Therefore, the dogma that an asymptomatic primary tumor should never be resected in patients with unresectable colorectal cancer metastases must be questioned.
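As a rough sketch of the stratified propensity-score workflow named in the methods (model the probability of resection, stratify on the score, then fit a stratified Cox model), with simulated data and far fewer covariates than SEER provides; every column name is hypothetical:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(40, 90, n),
    "tumor_grade": rng.integers(1, 5, n),
    "resection": rng.integers(0, 2, n),            # 1 = palliative primary tumor resection
    "months": rng.exponential(18, n).clip(1, 120),  # simulated survival time
    "died": rng.integers(0, 2, n),                  # placeholder death indicator
})

# Propensity score: probability of resection given baseline covariates
ps_model = LogisticRegression().fit(df[["age", "tumor_grade"]], df["resection"])
df["ps"] = ps_model.predict_proba(df[["age", "tumor_grade"]])[:, 1]
df["ps_stratum"] = pd.qcut(df["ps"], 5, labels=False, duplicates="drop")  # quintile strata

# Cox model for the resection effect, stratified on propensity quintiles
cph = CoxPHFitter()
cph.fit(df[["months", "died", "resection", "ps_stratum"]],
        duration_col="months", event_col="died", strata=["ps_stratum"])
cph.print_summary()  # exp(coef) for resection ~ the reported HR of death
```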
Abstract:
OBJECTIVES This study sought to evaluate: 1) the effect of impaired renal function on long-term clinical outcomes in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES); and 2) the safety and efficacy of new-generation compared with early-generation DES in women with chronic kidney disease (CKD). BACKGROUND The prevalence and effect of CKD in women undergoing PCI with DES are unclear. METHODS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by creatinine clearance (CrCl): <45 ml/min, 45 to 59 ml/min, and ≥60 ml/min. The primary endpoint was the 3-year rate of major adverse cardiovascular events (MACE). Participants for whom baseline creatinine was missing were excluded from the analysis. RESULTS Of the 4,217 women in the pooled cohort treated with DES for whom serum creatinine was available, 603 (14%) had a CrCl <45 ml/min, 811 (19%) had a CrCl of 45 to 59 ml/min, and 2,803 (66%) had a CrCl ≥60 ml/min. A significant stepwise gradient in MACE risk was observed with worsening renal function (26.6% vs. 15.8% vs. 12.9%; p < 0.01). After multivariable adjustment, CrCl <45 ml/min was independently associated with a higher risk of MACE (adjusted hazard ratio: 1.56; 95% confidence interval: 1.23 to 1.98) and all-cause mortality (adjusted hazard ratio: 2.67; 95% confidence interval: 1.85 to 3.85). Compared with early-generation DES, new-generation DES were associated with a reduced risk of cardiac death, myocardial infarction, or stent thrombosis in women with CKD. The effect of new-generation DES on outcomes was uniform between women with and without CKD, with no evidence of interaction. CONCLUSIONS Among women undergoing PCI with DES, CKD is a common comorbidity associated with a strong and independent risk for MACE that is durable over 3 years. The benefits of new-generation DES are uniform in women with and without CKD.
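The abstract does not state how creatinine clearance was estimated. Purely as an illustration, assuming the widely used Cockcroft-Gault equation, the study's CrCl categories could be assigned like this:

```python
def crcl_cockcroft_gault(age_years, weight_kg, serum_cr_mg_dl, female=True):
    """Creatinine clearance in ml/min via Cockcroft-Gault (an assumption, not the study's stated method)."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_cr_mg_dl)
    return crcl * 0.85 if female else crcl

def crcl_category(crcl):
    """Bin CrCl into the study's three categories."""
    if crcl < 45:
        return "<45 ml/min"
    elif crcl < 60:
        return "45-59 ml/min"
    return ">=60 ml/min"

# Hypothetical patient: 68 years, 62 kg, serum creatinine 1.4 mg/dl
print(crcl_category(crcl_cockcroft_gault(68, 62, 1.4)))  # -> "<45 ml/min"
```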
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors are unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by the presence or absence of high ATR, defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of the 10,449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had a significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In women at high ATR, the use of new-generation DES was associated with a significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P for interaction=0.14). In a landmark analysis of women at high ATR, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years the stent thrombosis risk was lower with new-generation devices. CONCLUSIONS The use of new-generation DES, even in women at high ATR, is associated with a benefit that is consistent over 3 years of follow-up and with a substantial improvement in very-late thrombotic safety.
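The landmark analysis mentioned above splits follow-up at a fixed time point (here 1 year): early events are assessed in the full cohort, and late events only among patients still event-free at the landmark. A minimal sketch of that split, assuming a data frame with time-to-first-event and 0/1 event columns (names are hypothetical):

```python
import pandas as pd

def landmark_split(df, landmark=1.0, time="years", event="stent_thrombosis"):
    """Split time-to-first-event data at a landmark; returns (early, late) frames."""
    early = df.copy()
    # Early window: censor everyone at the landmark
    early[event] = ((early[event] == 1) & (early[time] <= landmark)).astype(int)
    early[time] = early[time].clip(upper=landmark)
    # Late window: only patients event-free and still under follow-up at the landmark
    late = df[df[time] > landmark].copy()
    late[time] = late[time] - landmark  # restart the clock at the landmark
    return early, late
```

Each window is then analyzed separately, for example with its own Cox model or a simple rate comparison between DES generations.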
Abstract:
BACKGROUND Diabetes mellitus and angiographic coronary artery disease complexity are intertwined and unfavorably affect prognosis after percutaneous coronary intervention, but their relative impact on long-term outcomes after percutaneous coronary intervention with drug-eluting stents remains controversial. This study determined drug-eluting stent outcomes in relation to diabetic status and coronary artery disease complexity as assessed by the Synergy Between PCI With Taxus and Cardiac Surgery (SYNTAX) score. METHODS AND RESULTS In a patient-level pooled analysis of 4 all-comers trials, 6081 patients were stratified by diabetic status and by the median SYNTAX score (≤11 or >11). The primary end point was major adverse cardiac events, a composite of cardiac death, myocardial infarction, and clinically indicated target lesion revascularization within 2 years. Diabetes mellitus was present in 1310 patients (22%), and new-generation drug-eluting stents were used in 4554 patients (75%). Major adverse cardiac events occurred in 173 diabetic (14.5%) and 436 nondiabetic patients (9.9%; P<0.001). In adjusted Cox regression analyses, SYNTAX score and diabetes mellitus were both associated with the primary end point (P<0.001 and P=0.028, respectively; P for interaction, 0.07). In multivariable analyses, diabetic versus nondiabetic patients had higher risks of major adverse cardiac events (hazard ratio, 1.25; 95% confidence interval, 1.03-1.53; P=0.026) and target lesion revascularization (hazard ratio, 1.54; 95% confidence interval, 1.18-2.01; P=0.002) but similar risks of cardiac death (hazard ratio, 1.41; 95% confidence interval, 0.96-2.07; P=0.08) and myocardial infarction (hazard ratio, 0.89; 95% confidence interval, 0.64-1.22; P=0.45), without significant interaction with SYNTAX score ≤11 or >11 for any of the end points. CONCLUSIONS In this population, treated predominantly with new-generation drug-eluting stents, diabetic patients were at increased risk for repeat target lesion revascularization consistently across the spectrum of disease complexity. The SYNTAX score was an independent predictor of 2-year outcomes but did not modify the effect of diabetes mellitus. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00297661, NCT00389220, NCT00617084, and NCT01443104.
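Interaction tests like the reported P for interaction of 0.07 correspond to a Cox model that includes diabetes, a high-SYNTAX indicator, and their product term. A hedged sketch with simulated data, using the lifelines formula interface; all column names are invented:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1500
df = pd.DataFrame({
    "years": rng.exponential(2.0, n).clip(0.1, 2.0),  # follow-up capped at 2 years
    "mace": rng.integers(0, 2, n),                    # placeholder MACE indicator
    "diabetes": rng.integers(0, 2, n),
    "syntax_high": rng.integers(0, 2, n),             # 1 if SYNTAX score > 11
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="mace",
        formula="diabetes + syntax_high + diabetes:syntax_high")
# The P value on the diabetes:syntax_high term is the "P for interaction".
cph.print_summary()
```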
Abstract:
Detecting lame cows is important in improving animal welfare. Automated tools are potentially useful for identifying and monitoring lame cows. The goals of this study were to evaluate the suitability of various physiological and behavioral parameters for automatically detecting lameness in dairy cows housed in a cubicle barn. Lame cows suffering from a claw horn lesion (sole ulcer or white line disease) of one claw of the same hind limb (n=32; group L) and 10 nonlame healthy cows (group C) were included in this study. Lying and standing behavior at night (tridimensional accelerometers), weight distribution between hind limbs (4-scale weighing platform), feeding behavior at night (noseband sensor), and heart activity (Polar device; Polar Electro Oy, Kempele, Finland) were assessed. Either the entire data set or parts of the data collected over a 48-h period were used for statistical analysis, depending upon the parameter in question. The standing time at night over 12 h and the limb weight ratio (LWR) were significantly higher in group C compared with group L, whereas the lying time at night over 12 h, the mean limb difference (Δweight), and the standard deviation (SD) of the weight applied to the limb taking less weight were significantly lower in group C compared with group L. No significant difference was noted between the groups for the parameters of heart activity and feeding behavior at night. The locomotion score of cows in group L was positively correlated with the lying time and Δweight, whereas it was negatively correlated with LWR and SD. The highest sensitivity (0.97) for lameness detection was found for the parameter SD [specificity of 0.80 and an area under the curve (AUC) of 0.84]. The highest specificity (0.90) was found for Δweight (sensitivity=0.78; AUC=0.88) and LWR (sensitivity=0.81; AUC=0.87). The model combining SD with the lying time at night was the best predictor of cows being lame, accounting for 40% of the variation in the likelihood of a cow being lame (sensitivity=0.94; specificity=0.80; AUC=0.86). In conclusion, the data derived from the 4-scale weighing platform, either alone or combined with the lying time at night over 12 h, represent the most valuable parameters for automated identification of lame cows suffering from a claw horn lesion of one individual hind limb.
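Sensitivity, specificity, and AUC figures like those above follow from ROC analysis of a continuous predictor. A minimal sketch for a single predictor (a stand-in for the SD parameter), with simulated measurements and the cutoff chosen by Youden's index; the printed numbers are illustrative, not the study's:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(3)
# 10 non-lame (0) and 32 lame (1) cows, mirroring the group sizes above
lame = np.r_[np.zeros(10, dtype=int), np.ones(32, dtype=int)]
sd_weight = np.r_[rng.normal(4, 1, 10), rng.normal(7, 2, 32)]  # higher when lame (simulated)

auc = roc_auc_score(lame, sd_weight)
fpr, tpr, thresholds = roc_curve(lame, sd_weight)
best = np.argmax(tpr - fpr)  # Youden's J = sensitivity + specificity - 1
print(f"AUC={auc:.2f}  sensitivity={tpr[best]:.2f}  "
      f"specificity={1 - fpr[best]:.2f}  cutoff={thresholds[best]:.2f}")
```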
Abstract:
This study was carried out to detect differences in locomotion and feeding behavior between lame (group L; n = 41; gait score ≥ 2.5) and non-lame (group C; n = 12; gait score ≤ 2) multiparous Holstein cows in a cross-sectional study design. A model for automatic lameness detection was created using data from accelerometers attached to the hind limbs and noseband sensors attached to the head. Each cow's gait was videotaped and scored on a 5-point scale before and after a period of 3 consecutive days of behavioral data recording. The mean value from 3 independent experienced observers was taken as the definitive gait score and considered the gold standard. For statistical analysis, data from the noseband sensor and from one of the two accelerometers per cow (randomly selected), on 2 of 3 randomly selected days, were used. For comparisons between group L and group C, the t-test, the Aspin-Welch test, and the Wilcoxon test were used. The sensitivity and specificity of lameness detection were determined with logistic regression and ROC analysis. Group L, compared with group C, had significantly lower eating and ruminating times; fewer eating chews, ruminating chews, and ruminating boluses; longer lying times and lying bout durations; lower standing times; fewer standing and walking bouts; fewer, slower, and shorter strides; and a lower walking speed. The model considering the number of standing bouts and walking speed was the best predictor of cows being lame, with a sensitivity of 90.2% and a specificity of 91.7%. The sensitivity and specificity of the lameness detection model were considered very high, even without the use of halter data. It was concluded that, under the conditions of the study farm, accelerometer data were suitable for accurately distinguishing between lame and non-lame dairy cows, even in cases of slight lameness with a gait score of 2.5.
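A minimal sketch of the final two-predictor model described above (logistic regression on the number of standing bouts and walking speed). The data are simulated and the printed in-sample sensitivity and specificity are not the study's figures:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
# 41 lame (1) and 12 non-lame (0) cows, mirroring the group sizes above
lame = np.r_[np.ones(41, dtype=int), np.zeros(12, dtype=int)]
standing_bouts = np.r_[rng.normal(8, 2, 41), rng.normal(13, 2, 12)]      # fewer when lame (simulated)
walking_speed = np.r_[rng.normal(0.9, 0.15, 41), rng.normal(1.3, 0.15, 12)]  # m/s, slower when lame
X = np.column_stack([standing_bouts, walking_speed])

model = LogisticRegression().fit(X, lame)
pred = model.predict(X)  # class predictions at the default 0.5 cutoff
sensitivity = (pred[lame == 1] == 1).mean()
specificity = (pred[lame == 0] == 0).mean()
print(f"sensitivity={sensitivity:.3f}  specificity={specificity:.3f}")
```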