929 results for bandwidth 2.0 GHz to 2.45 GHz


Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: As the long-term survival of pancreatic head malignancies remains dismal, efforts have been made towards better patient selection and tailored treatment. Tumour size could also be used for patient stratification. METHODS: One hundred and fourteen patients underwent a pancreaticoduodenectomy for pancreatic adenocarcinoma, peri-ampullary or biliary cancer, stratified by tumour size: ≤20 mm, 21-34 mm, 35-45 mm and >45 mm. RESULTS: Patients with tumour sizes of ≤20 mm had an N1 rate of 41% and an R1/2 rate of 7%. The median survival was 3.4 years. N1 and R1/2 rates increased to 84% and 31% for tumour sizes of 21-34 mm (P = 0.0002 for N, P = 0.02 for R). The median survival decreased to 1.6 years (P = 0.0003). Tumour sizes of 35-45 mm showed a further increase in N1 and R1/2 rates, to 93% (P < 0.0001) and 33%, respectively. The median survival was 1.2 years (P = 0.004). Tumour sizes >45 mm were associated with a further decrease in median survival to 1.1 years (P = 0.2), whereas N1 and R1/2 rates were 87% and 20%, respectively. DISCUSSION: Tumour size is an important feature of pancreatic head malignancies. A tumour diameter of 20 mm appears to be the cut-off above which incomplete resections and metastatic lymph nodes become markedly more frequent and median survival is reduced.
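The four size strata used in the study can be expressed as a small helper. This is an illustrative sketch only: the function name and the handling of fractional millimetre values are my own assumptions, not part of the paper.

```python
def size_stratum(diameter_mm: float) -> str:
    """Assign a pancreatic head tumour to one of the four size strata
    used in the study (diameter in mm). Fractional values between
    strata boundaries are bucketed with the lower cut-off (assumption)."""
    if diameter_mm <= 20:
        return "<=20 mm"
    if diameter_mm <= 34:
        return "21-34 mm"
    if diameter_mm <= 45:
        return "35-45 mm"
    return ">45 mm"
```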


Aim: Isolated limb perfusion (ILP) is a technique that consists of administering chemotherapy doses up to 20 times higher than via the systemic route to a limb affected by melanoma or sarcoma, to maximise tumour reduction. ILP is performed in <50 centres worldwide and leads to partial or complete response, although without effect on overall survival. As an alternative to amputation, it improves patient quality of life. We report our >10-year single-centre experience of the role of nuclear medicine in ILP. Material and method: From 2000 to 2012, we performed 77 ILP (45 women, 32 men; aged 62±16 years) for 49 melanomas (64%), 25 sarcomas (32%) and 3 other tumours (2 desmoid tumours and 1 aggressive fibromatosis) (3%). The vascularisation of the affected limb is isolated from the systemic circulation (SYS) using extracorporeal circulation, and chemotherapy (usually TNF and melphalan) is administered. Peroperatively, limb isolation and any leakage from ILP to SYS are monitored by continuous measurement with a gamma-probe placed over the heart (150 MBq of 99mTc-human serum albumin in the ILP circuit and 4 MBq in SYS). The maximum acceptable leakage to the systemic circulation is 10% (the maximum tolerated systemic TNF dose). Results: In total, 47 patients (61%) had positive leaks from ILP to SYS of 4.1±14.5% (median 1%, interquartile range 0.4% to 3.2%, range 0 to 100%) and 30 patients (39%) had negative leaks from SYS to ILP of -0.9±1.2% (median -0.5%, interquartile range -0.8% to -0.2%, range -4.8% to -0.1%). In only 2 patients (2.6%) were leaks >10% observed, leading to interruption of the ILP. Conclusion: Nuclear medicine has a crucial role in the safety and quality of ILP, monitoring leakage peroperatively and helping to decide whether the procedure should be interrupted to minimise systemic toxicity.
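The peroperative leakage monitoring described above can be sketched as follows. This is a simplified illustration, not the centre's actual software: all names are hypothetical, and it assumes the precordial count rate scales linearly with systemic activity (calibrated by the small 4 MBq systemic dose), ignoring radioactive decay and background drift.

```python
def leakage_percent(cpm_now: float, cpm_baseline: float,
                    cal_cpm_per_mbq: float, limb_activity_mbq: float) -> float:
    """Estimate the % of limb-circuit activity leaked to the systemic
    circulation. cal_cpm_per_mbq is the precordial count rate per MBq of
    systemically injected tracer, derived from the small calibration dose."""
    leaked_mbq = (cpm_now - cpm_baseline) / cal_cpm_per_mbq
    return 100.0 * leaked_mbq / limb_activity_mbq

def should_abort(leak_pct: float, threshold: float = 10.0) -> bool:
    """The study's abort criterion: leakage above 10%."""
    return leak_pct > threshold
```

For example, with a calibration of 50 cpm/MBq, a baseline of 200 cpm and 150 MBq in the limb circuit, a reading of 450 cpm corresponds to roughly 3.3% leakage, well below the abort threshold.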


Atrial arrhythmias (AAs) are a common complication in adult patients with congenital heart disease. We sought to compare the lifetime prevalence of AAs in patients with right- versus left-sided congenital cardiac lesions and their effect on prognosis. A congenital heart disease diagnosis was assigned using International Classification of Diseases, Ninth Revision, diagnostic codes in the administrative databases of Quebec, from 1983 to 2005. Patients with AAs were those diagnosed with an International Classification of Diseases, Ninth Revision, code for atrial fibrillation or intra-atrial reentry tachycardia. To ensure that the diagnosis of AA was new, a washout period of 5 years after entry into the database was used, during which the patient could not have received an International Classification of Diseases, Ninth Revision, code for AA. The cumulative lifetime risk of AA was estimated using the Practical Incidence Estimators method. The hazard ratios (HRs) for mortality, morbidity, and cardiac interventions were compared between those with right- and left-sided lesions after adjustment for age, gender, disease severity, and cardiac risk factors. In a population of 71,467 patients, 7,756 adults developed AAs (isolated right-sided, 2,229; isolated left-sided, 1,725). The lifetime risk of developing AAs was significantly greater in patients with right-sided than in patients with left-sided lesions (61.0% vs 55.4%, p <0.001). The HRs for mortality and for the development of stroke or heart failure were similar in both groups (HR 0.96, 95% confidence interval [CI] 0.86 to 1.09; HR 0.94, 95% CI 0.80 to 1.09; and HR 1.10, 95% CI 0.98 to 1.23, respectively). However, the rates of cardiac catheterization (HR 0.63, 95% CI 0.55 to 0.72), cardiac surgery (HR 0.40, 95% CI 0.36 to 0.45), and arrhythmia surgery (HR 0.77, 95% CI 0.6 to 0.98) were significantly lower for patients with right-sided lesions.
In conclusion, patients with right-sided lesions had a greater lifetime burden of AAs. However, their morbidity and mortality were no lower than in those with left-sided lesions, although the rate of intervention was substantially different.


The objective of this work was to determine the sensitivity of maize (Zea mays) genotypes to water deficit, using a simple agrometeorological crop yield model. Actual crop yield and agronomic data of 26 genotypes were obtained from the Maize National Assays carried out in ten locations, in four Brazilian states, from 1998 to 2006. Weather information for each experimental location and period was obtained from the closest weather station. The water deficit sensitivity index (Ky) was determined using the crop yield depletion model. Genotypes could be divided into two groups according to their resistance to water deficit. Genotypes with normal resistance had Ky ranging from 0.4 to 0.5 in the vegetative period, 1.4 to 1.5 in flowering, 0.3 to 0.6 in fruiting, and 0.1 to 0.3 in the maturing period, whereas the more resistant genotypes had lower values: 0.2-0.4, 0.7-1.2, 0.2-0.4, and 0.1-0.2, respectively. The overall Ky for the total growing season was 2.15 for sensitive genotypes and 1.56 for resistant ones. Model performance was acceptable for estimating actual crop yield: average errors for each genotype ranged from -5.7% to +5.8%, and the overall mean absolute error was 960 kg ha-1 (10%).
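The "crop yield depletion model" from which Ky is derived is commonly the FAO-33 yield-response function, 1 − Ya/Ym = Ky·(1 − ETa/ETm), relating relative yield loss to relative evapotranspiration deficit. Assuming that formulation (the paper may differ in detail), a minimal sketch is:

```python
def relative_yield(ky: float, eta: float, etm: float) -> float:
    """FAO-33 style yield depletion: 1 - Ya/Ym = Ky * (1 - ETa/ETm).
    eta = actual evapotranspiration, etm = maximum evapotranspiration.
    Returns the relative yield Ya/Ym, clipped to [0, 1]."""
    loss = ky * (1.0 - eta / etm)
    return max(0.0, min(1.0, 1.0 - loss))
```

With the values above, a 20% evapotranspiration deficit during flowering costs a sensitive genotype (Ky = 1.5) about 30% of potential yield, but a resistant one (Ky = 0.2) only about 4%.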


BACKGROUND: The up to 5% of patients who present to the emergency department (ED) four or more times within a 12-month period account for 21% of total ED visits. In this study we sought to characterize social and medical vulnerability factors of ED frequent users (FUs) and to explore whether these factors hold simultaneously. METHODS: We performed a case-control study at Lausanne University Hospital, Switzerland. Patients over 18 years presenting to the ED at least once within the study period (April 2008 to March 2009) were included. FUs were defined as patients with four or more ED visits within the previous 12 months. Outcome data were extracted from medical records of the first ED attendance within the study period. Outcomes included basic demographics and social variables, ED admission diagnosis, somatic and psychiatric days hospitalized over 12 months, and having a primary care physician. We calculated the percentage of FUs and non-FUs having at least one social and one medical vulnerability factor. The four chosen social factors were: unemployed and/or dependent on government welfare; institutionalized and/or without fixed residence; separated, divorced or widowed; and under guardianship. The four medical vulnerability factors were: ≥6 somatic days hospitalized, ≥1 psychiatric day hospitalized, ≥5 clinical departments used (all three measured over 12 months), and an ED admission diagnosis of alcohol and/or drug abuse. Univariate and multivariate logistic regression analyses allowed comparison of two random samples of 354 FUs and 354 non-FUs (statistical power 0.9, alpha 0.05 for all outcomes except gender, country of birth, and insurance type). RESULTS: FUs accounted for 7.7% of ED patients and 24.9% of ED visits. Univariate logistic regression showed that FUs were older (mean age 49.8 vs. 45.2 yrs, p=0.003), more often separated and/or divorced (17.5% vs. 13.9%, p=0.029) or widowed (13.8% vs. 8.8%, p=0.029), and more often unemployed or dependent on government welfare (31.3% vs. 13.3%, p<0.001), compared to non-FUs. FUs accumulated more days hospitalized over 12 months (mean somatic days per patient 1.0 vs. 0.3, p<0.001; mean psychiatric days per patient 0.12 vs. 0.03, p<0.001). The two groups were similar in gender distribution (females 51.7% vs. 48.3%). The multivariate logistic regression model was based on the six most significant factors identified by univariate analysis. The model showed that FUs had more social problems: they were more likely to be institutionalized or without fixed residence (OR 4.62; 95% CI, 1.65 to 12.93) and to be unemployed or dependent on government welfare (OR 2.03; 95% CI, 1.31 to 3.14) compared to non-FUs. FUs were more likely to need medical care, as indicated by involvement of ≥5 clinical departments over 12 months (OR 6.2; 95% CI, 3.74 to 10.15), an ED admission diagnosis of substance abuse (OR 3.23; 95% CI, 1.23 to 8.46) and having a primary care physician (OR 1.70; 95% CI, 1.13 to 2.56); however, they were less likely to present with an admission diagnosis of injury (OR 0.64; 95% CI, 0.40 to 1.00) compared to non-FUs. FUs were more likely to combine at least one social with one medical vulnerability factor (38.4% vs. 12.1%, OR 7.74; 95% CI, 5.03 to 11.93). CONCLUSIONS: FUs were more likely than non-FUs to have social and medical vulnerability factors and to have multiple factors in combination.
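The odds ratios reported above come out of logistic regression. As a reminder of the mechanics (generic, not the study's code): a fitted coefficient β with standard error SE maps to an odds ratio exp(β) with a Wald 95% confidence interval exp(β ± 1.96·SE).

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald confidence interval (default 95%)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))
```

A coefficient of 0 (no effect) yields an OR of 1.0 with an interval straddling 1; the significance statements in the abstract correspond to intervals that exclude 1.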


BACKGROUND AND PURPOSE: Intravenous thrombolysis (IVT) for stroke seems to be beneficial independent of the underlying etiology. Whether this also holds for cervical artery dissection (CAD) is addressed in this study. METHODS: We used the Swiss IVT databank to compare outcome and complications of IVT-treated patients with CAD with IVT-treated patients with other etiologies (non-CAD patients). Main outcome and complication measures were favorable 3-month outcome, intracranial cerebral hemorrhage, and recurrent ischemic stroke. A modified Rankin Scale score ≤1 at 3 months was considered favorable. RESULTS: Fifty-five (5.2%) of 1062 IVT-treated patients had CAD. Patients with CAD were younger (median age 50 versus 70 years) but had similar median National Institutes of Health Stroke Scale scores (14 versus 13) and time to treatment (152.5 versus 156 minutes) as non-CAD patients. In the CAD group, 36% (20 of 55) had a favorable 3-month outcome, compared with 44% (447 of 1007) of non-CAD patients (OR, 0.72; 95% CI, 0.41 to 1.26), which was less favorable after adjustment for age, gender, and National Institutes of Health Stroke Scale score (OR, 0.50; 95% CI, 0.27 to 0.95; P=0.03). Intracranial cerebral hemorrhages (asymptomatic, symptomatic, fatal) were equally frequent in CAD (14% [7%, 7%, 2%]) and non-CAD patients (14% [9%, 5%, 2%]; P=0.99). Recurrent ischemic stroke occurred in 1.8% of patients with CAD and in 3.7% of non-CAD patients (P=0.71). CONCLUSIONS: IVT-treated patients with CAD do not recover as well as IVT-treated non-CAD patients. However, intracranial bleedings and recurrent ischemic strokes were equally frequent in both groups; they do not account for the different outcomes, and these findings indicate that IVT should not be withheld from patients who may have CAD. Hemodynamic compromise or frequent tandem occlusions might explain the less favorable outcome of patients with CAD.


The objective of this work was to compare random regression models for the estimation of genetic parameters for Guzerat milk production, using orthogonal Legendre polynomials. Records (20,524) of test-day milk yield (TDMY) from 2,816 first-lactation Guzerat cows were used. TDMY, grouped into 10 monthly classes, was analyzed for additive genetic and permanent environmental effects (random effects), whereas contemporary group, calving age (linear and quadratic effects) and the mean lactation curve were analyzed as fixed effects. Trajectories of the additive genetic and permanent environmental effects were modeled by means of a covariance function employing orthogonal Legendre polynomials of the second to the fifth order. Residual variances were modeled in one, four, six, or ten variance classes. The best model had six residual variance classes. Heritability estimates for the TDMY records varied from 0.19 to 0.32. The random regression model using a second-order Legendre polynomial for the additive genetic effect and a fifth-order polynomial for the permanent environmental effect was adequate according to the main criteria employed. The model with a second-order Legendre polynomial for the additive genetic effect and a fourth-order polynomial for the permanent environmental effect could also be employed in these analyses.
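Random regression on Legendre polynomials works by mapping each test-day to the interval [-1, 1] and evaluating the polynomial basis there; the random effects are then regressions on that basis. A minimal sketch of the basis-evaluation step (the standardization formula and Bonnet recurrence are standard; this is not the authors' code):

```python
def legendre(x: float, order: int) -> list:
    """Evaluate Legendre polynomials P_0..P_order at x in [-1, 1]
    using the Bonnet recurrence (n+1)P_{n+1} = (2n+1)x P_n - n P_{n-1}."""
    p = [1.0, x]
    for n in range(1, order):
        p.append(((2 * n + 1) * x * p[n] - n * p[n - 1]) / (n + 1))
    return p[: order + 1]

def standardize(t: float, t_min: float, t_max: float) -> float:
    """Map a test-day (e.g. monthly class 1..10 of lactation) to [-1, 1]."""
    return 2.0 * (t - t_min) / (t_max - t_min) - 1.0
```

For instance, monthly class 1 maps to -1, class 10 maps to +1, and the basis values at those points feed the covariance function for each animal's curve.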


BACKGROUND: Workers with persistent disabilities after orthopaedic trauma may need occupational rehabilitation. Despite various risk profiles for non-return-to-work (non-RTW), no predictive model is available. Moreover, injured workers may have various origins (immigrant workers), which may affect either their return to work or their eligibility for research purposes. The aim of this study was to develop and validate a predictive model that estimates the likelihood of non-RTW after occupational rehabilitation, using predictors that do not rely on the worker's background. METHODS: Prospective cohort study (3177 participants: native (51%) and immigrant workers (49%)) with two samples: a) a development sample of patients from 2004 to 2007, with full and reduced models; b) external validation of the reduced model with patients from 2008 to March 2010. We collected patients' data and biopsychosocial complexity with an observer-rated interview (INTERMED). Non-RTW was assessed two years after discharge from rehabilitation. Discrimination was assessed by the area under the receiver operating characteristic curve (AUC), and calibration was evaluated with a calibration plot. The model was reduced with random forests. RESULTS: At 2 years, non-RTW status was known for 2462 patients (77.5% of the total sample). The prevalence of non-RTW was 50%. The full model (36 items) and the reduced model (19 items) had acceptable discrimination performance (AUC 0.75, 95% CI 0.72 to 0.78, and 0.74, 95% CI 0.71 to 0.76, respectively) and good calibration. For the validation model, discrimination performance was acceptable (AUC 0.73; 95% CI 0.70 to 0.77) and calibration was also adequate. CONCLUSIONS: Non-RTW may be predicted with a simple model constructed with variables independent of the patient's education and language fluency. This model is useful across all kinds of trauma for case-mix adjustment, and it is applicable to vulnerable populations such as immigrant workers.
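The AUC used here to judge discrimination has a simple probabilistic reading: the chance that a randomly drawn non-RTW patient receives a higher predicted risk than a randomly drawn RTW patient, with ties counted as half. A brute-force sketch (illustrative only, O(n·m), not the study's implementation):

```python
def auc(scores_pos: list, scores_neg: list) -> float:
    """AUC as the probability that a positive case (e.g. non-RTW) scores
    higher than a negative case: Mann-Whitney U divided by n*m,
    counting ties as 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.75, as reported for the full model, means three out of four such random pairs are ranked correctly.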


BACKGROUND: The Spiritual Distress Assessment Tool (SDAT) is a 5-item instrument developed to assess unmet spiritual needs in hospitalized elderly patients and to determine the presence of spiritual distress. The objective of this study was to investigate the psychometric properties of the SDAT. METHODS: This cross-sectional study was performed in a geriatric rehabilitation unit. Patients (N = 203), aged 65 years and over with a Mini Mental State Exam score ≥20, were consecutively enrolled over a 6-month period. Data on health, functional, cognitive, affective and spiritual status were collected upon admission. Interviews using the SDAT (score from 0 to 15, higher scores indicating higher distress) were conducted by a trained chaplain. Factor analysis, measures of internal consistency (inter-item and item-to-total correlations, Cronbach's α), and reliability (intra-rater and inter-rater) were performed. Criterion-related validity was assessed using the Functional Assessment of Chronic Illness Therapy-Spiritual Well-being (FACIT-Sp) and the question "Are you at peace?" as criterion standards. Concurrent and predictive validity were assessed using the Geriatric Depression Scale (GDS), occurrence of a family meeting, hospital length of stay (LOS) and destination at discharge. RESULTS: SDAT scores ranged from 1 to 11 (mean 5.6 ± 2.4). Overall, 65.0% (132/203) of the patients reported some spiritual distress on the SDAT total score and 22.2% (45/203) reported at least one severe unmet spiritual need. A two-factor solution explained 60% of the variance. Inter-item correlations ranged from 0.11 to 0.41 (eight out of ten with P < 0.05). Item-to-total correlations ranged from 0.57 to 0.66 (all P < 0.001). Cronbach's α was acceptable (0.60). Intra-rater and inter-rater reliabilities were high (intraclass correlation coefficients ranging from 0.87 to 0.96).
The SDAT correlated significantly with the FACIT-Sp, "Are you at peace?", and GDS (Rho -0.45, -0.33, and 0.43, respectively, all P < .001), and with LOS (Rho 0.15, P = .03). Compared with patients with no severe unmet spiritual need, patients with at least one severe unmet spiritual need had higher odds of a family meeting (adjusted OR 4.7, 95% CI 1.4-16.3, P = .02) and were more often discharged to a nursing home (13.3% vs 3.8%; P = .027). CONCLUSIONS: The SDAT has acceptable psychometric properties and appears to be a valid and reliable instrument to assess spiritual distress in elderly hospitalized patients.
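The internal-consistency statistic reported above, Cronbach's α, follows the standard formula α = k/(k−1)·(1 − Σσᵢ²/σ²_total), where k is the number of items. A minimal sketch using population variances (the data layout, a list of item-score columns, is my own choice for illustration):

```python
from statistics import pvariance

def cronbach_alpha(items: list) -> float:
    """Cronbach's alpha for a list of item-score columns
    (each column is a list of scores, one per respondent)."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]  # per-respondent totals
    item_var = sum(pvariance(col) for col in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))
```

Perfectly redundant items give α = 1; an item carrying no variance in common with the rest drags α toward 0, which is why 0.60 on a short 5-item scale is described as acceptable rather than strong.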


PURPOSE: This prospective multicenter phase III study compared the efficacy and safety of a triple combination (bortezomib-thalidomide-dexamethasone [VTD]) versus a dual combination (thalidomide-dexamethasone [TD]) in patients with multiple myeloma (MM) progressing or relapsing after autologous stem-cell transplantation (ASCT). PATIENTS AND METHODS: Overall, 269 patients were randomly assigned to receive bortezomib (1.3 mg/m² intravenous bolus) or no bortezomib for 1 year, in combination with thalidomide (200 mg per day orally) and dexamethasone (40 mg orally once a day on 4 days once every 3 weeks). Bortezomib was administered on days 1, 4, 8, and 11 with a 10-day rest period (day 12 to day 21) for eight cycles (6 months), and then on days 1, 8, 15, and 22 with a 20-day rest period (day 23 to day 42) for four cycles (6 months). RESULTS: Median time to progression (primary end point) was significantly longer with VTD than TD (19.5 v 13.8 months; hazard ratio, 0.59; 95% CI, 0.44 to 0.80; P = .001), the complete response plus near-complete response rate was higher (45% v 25%; P = .001), and the median duration of response was longer (17.2 v 13.4 months; P = .03). The 24-month survival rate was in favor of VTD (71% v 65%; P = .093). Grade 3 peripheral neuropathy was more frequent with VTD (29% v 12%; P = .001), as were the rates of grades 3 and 4 infection and thrombocytopenia. CONCLUSION: VTD was more effective than TD in the treatment of patients with MM with progressive or relapsing disease post-ASCT, but was associated with a higher incidence of grade 3 neurotoxicity.


In the electrical industry, the 50 Hz electric and magnetic fields are often higher than in the average working environment. The electric and magnetic fields can be studied by measuring or by calculating the fields in the environment. For example, the electric field under a 400 kV power line is 1 to 10 kV/m, and the magnetic flux density is 1 to 15 µT. The electric and magnetic fields of a power line induce a weak electric field and electric currents in the exposed body. The average current density in a human standing under a 400 kV line is 1 to 2 mA/m². The aim of this study was to find out the possible effects of short-term exposure to the electric and magnetic fields of electricity power transmission on workers' health, in particular cardiovascular effects. The study consists of two parts; Experiment I: influence on extrasystoles, and Experiment II: influence on heart rate. In Experiment I, two groups, 26 voluntary men (Group 1) and 27 transmission-line workers (Group 2), were measured. Their electrocardiogram (ECG) was recorded with an ambulatory recorder both in and outside the field. In Group 1 the fields were 1.7 to 4.9 kV/m and 1.1 to 7.1 µT; in Group 2 they were 0.1 to 10.2 kV/m and 1.0 to 15.4 µT. In the ECG analysis, the only significant observation was a decrease in heart rate after field exposure (Group 1). The drop could not be explained with the first measuring method; therefore Experiment II was carried out. In Experiment II, two groups were used: Group 1 (26 male volunteers) was measured under real field exposure, Group 2 (15 male volunteers) in "sham" fields. The subjects of Group 1 spent 1 h outside the field, then 1 h in the field under a 400 kV transmission line, and then again 1 h outside the field. Under the 400 kV line the field strength varied from 3.5 to 4.3 kV/m and from 1.4 to 6.6 µT. Group 2 spent the entire test period (3 h) at a 33 kV outdoor testing station in a "sham" field.
ECG, blood pressure, and electroencephalogram (EEG) were measured by ambulatory methods. Before and after the field exposure, the subjects performed some cardiovascular autonomic function tests. The analysis of the results (Experiments I and II) showed that extrasystoles and arrhythmias were as frequent in the field (below 4 kV/m and 4 µT) as outside it. In Experiment II no decrease in heart rate was detected, and the systolic and diastolic blood pressure stayed nearly the same. No health effects were found in this study.


BACKGROUND: Chronic HCV infection is a leading cause of liver-related morbidity globally. The innate and adaptive immune responses are thought to be important in determining viral outcomes. Polymorphisms associated with the IFNL3 (IL28B) gene are strongly associated with spontaneous clearance and treatment outcomes. OBJECTIVE: This study investigates the importance of HLA genes in the context of genetic variation associated with the innate immune genes IFNL3 and KIR2DS3. DESIGN: We assessed the collective influence of HLA and innate immune genes on viral outcomes in an Irish cohort of women (n=319) who had been infected from a single source, as well as in a more heterogeneous cohort (Swiss cohort, n=461). In the Irish cohort, a number of HLA alleles were associated with different outcomes, and the impact of IFNL3-linked polymorphisms was profound. RESULTS: Logistic regression performed on data from the Irish cohort indicates that HLA-A*03 (OR 0.36 (0.15 to 0.89), p=0.027), -B*27 (OR 0.12 (0.03 to 0.45), p<0.001), -DRB1*01:01 (OR 0.2 (0.07 to 0.61), p=0.005), -DRB1*04:01 (OR 0.31 (0.12 to 0.85), p=0.02) and the CC IFNL3 rs12979860 genotype (OR 0.1 (0.04 to 0.23), p<0.001) are significantly associated with viral clearance. Furthermore, DQB1*02:01 (OR 4.2 (2.04 to 8.66), p=0.008), KIR2DS3 (OR 4.36 (1.62 to 11.74), p=0.004) and the rs12979860 IFNL3 'T' allele are associated with chronic infection. The study found no interactive effect between IFNL3 and these Class I and II alleles in relation to viral clearance; there is, however, a clear additive effect. Data from the Swiss cohort also confirm independent and additive effects of HLA Class I, Class II and IFNL3 genes in their prediction of viral outcome. CONCLUSIONS: These data support a critical role for the adaptive immune response in the control of HCV, in concert with the innate immune response.


Fruits of five regional selections of rambutan (Nephelium lappaceum L.) were characterized to identify those with international marketing quality, in order to promote their propagation in Mexico, their improvement, and their conservation in a germplasm bank. The fruits were harvested in June, July, and August 2008 and, after each harvest, were assessed for shape (length/diameter), firmness, fruit weight, number of fruits per kilogram, weight and percentage of pericarp, seed and aril, total soluble solids, total sugars, vitamin C content, pH, and titratable acidity. In addition, a sensory evaluation was carried out with 31 panelists who graded each selection for color, sweetness, and acidity. Fruits of the five selections were ovoid, with the following characteristics: firmness from 43.7 to 51.0 N; fruit weight from 22.4 to 34.7 g (28.9 to 45.0 fruits per kg); pericarp weight from 10.5 to 17.3 g (45.9 to 49.9% of the total fruit weight); total seed weight from 2.2 to 2.5 g (7.0 to 10.0%); and average aril weight from 8.9 to 13.1 g (37.5 to 41.4%). The fruits had high contents of total soluble solids (17.8 to 20.4 °Brix), total sugars (211.95 to 242.70 mg/100 g in the edible portion), and vitamin C (37.9 to 69.1 mg/100 g), with pH 5.0 and titratable acidity of 0.20 to 0.28%. The fruits of the RT-01 and RT-05 selections had better attributes in fruit weight, total soluble solids and titratable acidity, and were better accepted by the panelists. Harvest date significantly affects rambutan fruit quality; fruits harvested in the middle and at the end of the season had better qualitative characteristics for marketing.


BACKGROUND: Vorapaxar is a new oral protease-activated receptor 1 (PAR-1) antagonist that inhibits thrombin-induced platelet activation. METHODS: In this multinational, double-blind, randomized trial, we compared vorapaxar with placebo in 12,944 patients who had acute coronary syndromes without ST-segment elevation. The primary end point was a composite of death from cardiovascular causes, myocardial infarction, stroke, recurrent ischemia with rehospitalization, or urgent coronary revascularization. RESULTS: Follow-up in the trial was terminated early after a safety review. After a median follow-up of 502 days (interquartile range, 349 to 667), the primary end point occurred in 1031 of 6473 patients receiving vorapaxar versus 1102 of 6471 patients receiving placebo (Kaplan-Meier 2-year rate, 18.5% vs. 19.9%; hazard ratio, 0.92; 95% confidence interval [CI], 0.85 to 1.01; P = 0.07). A composite of death from cardiovascular causes, myocardial infarction, or stroke occurred in 822 patients in the vorapaxar group versus 910 in the placebo group (14.7% and 16.4%, respectively; hazard ratio, 0.89; 95% CI, 0.81 to 0.98; P = 0.02). Rates of moderate and severe bleeding were 7.2% in the vorapaxar group and 5.2% in the placebo group (hazard ratio, 1.35; 95% CI, 1.16 to 1.58; P<0.001). Intracranial hemorrhage rates were 1.1% and 0.2%, respectively (hazard ratio, 3.39; 95% CI, 1.78 to 6.45; P<0.001). Rates of nonhemorrhagic adverse events were similar in the two groups. CONCLUSIONS: In patients with acute coronary syndromes, the addition of vorapaxar to standard therapy did not significantly reduce the primary composite end point but significantly increased the risk of major bleeding, including intracranial hemorrhage. (Funded by Merck; TRACER ClinicalTrials.gov number, NCT00527943.)


IMPORTANCE: Associations between subclinical thyroid dysfunction and fractures are unclear and clinical trials are lacking. OBJECTIVE: To assess the association of subclinical thyroid dysfunction with hip, nonspine, spine, or any fractures. DATA SOURCES AND STUDY SELECTION: The databases of MEDLINE and EMBASE (inception to March 26, 2015) were searched without language restrictions for prospective cohort studies with thyroid function data and subsequent fractures. DATA EXTRACTION: Individual participant data were obtained from 13 prospective cohorts in the United States, Europe, Australia, and Japan. Levels of thyroid function were defined as euthyroidism (thyroid-stimulating hormone [TSH], 0.45-4.49 mIU/L), subclinical hyperthyroidism (TSH <0.45 mIU/L), and subclinical hypothyroidism (TSH ≥4.50-19.99 mIU/L) with normal thyroxine concentrations. MAIN OUTCOME AND MEASURES: The primary outcome was hip fracture. Any fractures, nonspine fractures, and clinical spine fractures were secondary outcomes. RESULTS: Among 70,298 participants, 4092 (5.8%) had subclinical hypothyroidism and 2219 (3.2%) had subclinical hyperthyroidism. During 762,401 person-years of follow-up, hip fracture occurred in 2975 participants (4.6%; 12 studies), any fracture in 2528 participants (9.0%; 8 studies), nonspine fracture in 2018 participants (8.4%; 8 studies), and spine fracture in 296 participants (1.3%; 6 studies). In age- and sex-adjusted analyses, the hazard ratio (HR) for subclinical hyperthyroidism vs euthyroidism was 1.36 for hip fracture (95% CI, 1.13-1.64; 146 events in 2082 participants vs 2534 in 56,471); for any fracture, HR was 1.28 (95% CI, 1.06-1.53; 121 events in 888 participants vs 2203 in 25,901); for nonspine fracture, HR was 1.16 (95% CI, 0.95-1.41; 107 events in 946 participants vs 1745 in 21,722); and for spine fracture, HR was 1.51 (95% CI, 0.93-2.45; 17 events in 732 participants vs 255 in 20,328). 
Lower TSH was associated with higher fracture rates: for TSH of less than 0.10 mIU/L, HR was 1.61 for hip fracture (95% CI, 1.21-2.15; 47 events in 510 participants); for any fracture, HR was 1.98 (95% CI, 1.41-2.78; 44 events in 212 participants); for nonspine fracture, HR was 1.61 (95% CI, 0.96-2.71; 32 events in 185 participants); and for spine fracture, HR was 3.57 (95% CI, 1.88-6.78; 8 events in 162 participants). Risks were similar after adjustment for other fracture risk factors. Endogenous subclinical hyperthyroidism (excluding thyroid medication users) was associated with HRs of 1.52 (95% CI, 1.19-1.93) for hip fracture, 1.42 (95% CI, 1.16-1.74) for any fracture, and 1.74 (95% CI, 1.01-2.99) for spine fracture. No association was found between subclinical hypothyroidism and fracture risk. CONCLUSIONS AND RELEVANCE: Subclinical hyperthyroidism was associated with an increased risk of hip and other fractures, particularly among those with TSH levels of less than 0.10 mIU/L and those with endogenous subclinical hyperthyroidism. Further study is needed to determine whether treating subclinical hyperthyroidism can prevent fractures.
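The TSH categories driving this analysis can be captured directly from the cut-offs stated in the text; the function name, the `normal_t4` flag and the fallback label are illustrative choices, not part of the study.

```python
def thyroid_status(tsh_miu_l: float, normal_t4: bool = True) -> str:
    """Classify thyroid function by TSH (mIU/L) using the study's
    cut-offs; the subclinical categories require normal thyroxine (T4)."""
    if 0.45 <= tsh_miu_l <= 4.49:
        return "euthyroid"
    if tsh_miu_l < 0.45 and normal_t4:
        return "subclinical hyperthyroidism"
    if 4.50 <= tsh_miu_l <= 19.99 and normal_t4:
        return "subclinical hypothyroidism"
    return "overt dysfunction suspected"
```

Note the finding above that risk is graded within the subclinical hyperthyroid band: TSH below 0.10 mIU/L carried higher fracture hazard ratios than the band as a whole.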