Abstract:
OBJECTIVE: To determine whether differences in short-term virologic failure among commonly used antiretroviral therapy (ART) regimens translate to differences in clinical events in antiretroviral-naïve patients initiating ART. DESIGN: Observational cohort study of patients initiating ART between January 2000 and December 2005. SETTING: The Antiretroviral Therapy Cohort Collaboration (ART-CC) is a collaboration of 15 HIV cohort studies from Canada, Europe, and the United States. STUDY PARTICIPANTS: A total of 13 546 antiretroviral-naïve HIV-positive patients initiating ART with efavirenz, nevirapine, lopinavir/ritonavir, nelfinavir, or abacavir as the third drug in combination with a zidovudine and lamivudine nucleoside reverse transcriptase inhibitor backbone. MAIN OUTCOME MEASURES: Short-term (24-week) virologic failure (>500 copies/ml) and clinical events within 2 years of ART initiation (incident AIDS-defining event, death, and a composite measure of these two outcomes). RESULTS: Compared with efavirenz as the initial third drug, short-term virologic failure was more common with all other third drugs evaluated: nevirapine (adjusted odds ratio = 1.87, 95% confidence interval (CI) = 1.58-2.22), lopinavir/ritonavir (1.32, 95% CI = 1.12-1.57), nelfinavir (3.20, 95% CI = 2.74-3.74), and abacavir (2.13, 95% CI = 1.82-2.50). However, the rate of clinical events within 2 years of ART initiation appeared higher only with nevirapine (adjusted hazard ratio for the composite outcome measure 1.27, 95% CI = 1.04-1.56) and abacavir (1.22, 95% CI = 1.00-1.48). CONCLUSION: Among antiretroviral-naïve patients initiating therapy, between-regimen differences in short-term virologic failure do not necessarily translate to differences in clinical outcomes. Our results should be interpreted with caution because of the possibility of residual confounding by indication.
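The adjusted odds ratios above come from a multivariable logistic model of 24-week failure status by third drug, with efavirenz as the reference. Below is a minimal sketch of how such regimen-level adjusted ORs might be estimated; the data frame and column names (virologic_failure_24wk, third_drug, and the baseline covariates) are hypothetical, and this is illustrative only, not the ART-CC analysis code.

```python
# Illustrative sketch: adjusted odds ratios for 24-week virologic failure
# (>500 copies/ml) by third drug, with efavirenz as the reference category.
# All column names are assumptions, not the ART-CC variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_failure_ors(df: pd.DataFrame) -> pd.Series:
    """Fit a logistic model and return exponentiated coefficients (odds ratios)."""
    model = smf.logit(
        "virologic_failure_24wk ~ C(third_drug, Treatment('efavirenz'))"
        " + age + baseline_cd4 + baseline_log_rna + C(sex)",
        data=df,
    ).fit(disp=False)
    return np.exp(model.params)  # odds ratios vs. the efavirenz reference
```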
Abstract:
Cardiovascular failure and low flow states may arise in very different conditions from both cardiac and noncardiac causes. Systemic hemodynamic failure inevitably alters splanchnic blood flow but in an unpredictable way. Prolonged low splanchnic blood flow causes intestinal ischemia, increased mucosal permeability, endotoxemia, and distant organ failure. Mortality associated with intestinal ischemia is high. Why would enteral nutrition (EN) be desirable in these complex patients when parenteral nutrition could easily cover energy and substrate requirements? Metabolic, immune, and practical reasons justify the use of EN. In addition, continuous enteral feeding minimizes systemic and myocardial oxygen consumption in patients with congestive heart failure. Further, early feeding in critically ill mechanically ventilated patients has been shown to reduce mortality, particularly in the sickest patients. In a series of cardiac surgery patients with compromised hemodynamics, absorption has been maintained, and 1000-1200 kcal/d could be delivered by enteral feeding. Therefore, early EN in stabilized patients should be attempted, and can be carried out safely under close clinical monitoring, looking for signs of incipient intestinal ischemia. Energy delivery and balance should be monitored, and combined feeding considered when enteral feeds cannot be advanced to target within 4-6 days.
Abstract:
BACKGROUND: The impact of abnormal spirometric findings on risk for incident heart failure among older adults without clinically apparent lung disease is not well elucidated. METHODS: We evaluated the association of baseline lung function with incident heart failure, defined as first hospitalization for heart failure, in 2125 participants of the community-based Health, Aging, and Body Composition (Health ABC) Study (age, 73.6 +/- 2.9 years; 50.5% men; 62.3% white; 37.7% black) without prevalent lung disease or heart failure. Abnormal lung function was defined as either forced vital capacity (FVC) or the ratio of forced expiratory volume in the first second (FEV1) to FVC below the lower limit of normal. Percent predicted FVC and FEV1 also were assessed as continuous variables. RESULTS: During follow-up (median, 9.4 years), heart failure developed in 68 of 350 (19.4%) participants with abnormal baseline lung function, as compared with 172 of 1775 (9.7%) participants with normal lung function (hazard ratio [HR] 2.31; 95% confidence interval [CI], 1.74-3.07; P <.001). This increased risk persisted after adjusting for previously identified heart failure risk factors in the Health ABC Study, body mass index, incident coronary heart disease, and inflammatory markers (HR 1.83; 95% CI, 1.33-2.50; P <.001). Percent predicted FVC and FEV1 had a linear association with heart failure risk (HR 1.21; 95% CI, 1.11-1.32 and HR 1.18; 95% CI, 1.10-1.26, per 10% lower percent predicted FVC and FEV1, respectively; both P <.001 in fully adjusted models). Findings were consistent in sex and race subgroups and for heart failure with preserved or reduced ejection fraction. CONCLUSIONS: Abnormal spirometric findings in older adults without clinical lung disease are associated with increased heart failure risk.
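As a small illustration of the study's operational definition (an assumed helper, not the Health ABC analysis code), abnormal lung function means FVC below its lower limit of normal (LLN) or the FEV1/FVC ratio below its LLN, with the LLN values assumed to come from standard spirometry reference equations.

```python
# Illustrative sketch of the abstract's spirometric criterion; the LLN inputs
# are assumed to be supplied by standard reference equations.
def abnormal_lung_function(fvc_l: float, fev1_l: float,
                           fvc_lln_l: float, ratio_lln: float) -> bool:
    """True if FVC < LLN or FEV1/FVC < its LLN (the study's definition)."""
    return fvc_l < fvc_lln_l or (fev1_l / fvc_l) < ratio_lln
```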
Abstract:
ABSTRACT: BACKGROUND: Fractures associated with bone fragility in older adults signal the potential for secondary fracture. Fragility fractures often precipitate further decline in health and loss of mobility, with high associated costs for patients, families, society and the healthcare system. Promptly initiating a coordinated, comprehensive pharmacological bone health and falls prevention program post-fracture may improve osteoporosis treatment compliance and reduce rates of falls and secondary fractures, and the associated morbidity, mortality and costs. METHODS/DESIGN: This pragmatic, controlled trial at 11 hospital sites in eight regions in Quebec, Canada, will recruit community-dwelling patients over age 50 who have sustained a fragility fracture to either a coordinated intervention program or standard care, according to the site. Site study coordinators will identify and recruit 1,596 participants for each study arm. Coordinators at intervention sites will facilitate continuity of care for bone health and arrange fall prevention programs including physical exercise. The intervention teams include medical bone specialists, primary care physicians, pharmacists, nurses, rehabilitation clinicians, and community program organizers. The primary outcome of this study is the incidence of secondary fragility fractures within an 18-month follow-up period. Secondary outcomes include initiation of and compliance with bone health medication; time to first fall and number of clinically significant falls; fall-related hospitalization and mortality; physical activity; quality of life; fragility fracture-related costs; admission to a long-term care facility; participants' perceptions of care integration, expectations and satisfaction with the program; and participants' compliance with the fall prevention program. Finally, professionals at intervention sites will participate in focus groups to identify barriers to and facilitating factors for the integrated fragility fracture prevention program. This integrated program will facilitate knowledge translation and dissemination via the following: involvement of various collaborators during the development and set-up of the integrated program; distribution of pamphlets about osteoporosis and fall prevention strategies to primary care physicians in the intervention group and patients in the control group; participation in evaluation activities; and eventual dissemination of study results. STUDY/TRIAL REGISTRATION: ClinicalTrials.gov NCT01745068. Study ID number: CIHR grant # 267395.
Abstract:
Bone mineral density (BMD) measured by dual-energy X-ray absorptiometry (DXA) is used to diagnose osteoporosis and assess fracture risk. However, DXA cannot evaluate trabecular microarchitecture. This study used a novel software program (TBS iNsight; Med-Imaps, Geneva, Switzerland) to estimate bone texture (trabecular bone score [TBS]) from standard spine DXA images. We hypothesized that TBS assessment would differentiate women with low trauma fracture from those without. In this study, TBS was performed blinded to fracture status on existing research DXA lumbar spine (LS) images from 429 women. Mean participant age was 71.3 yr, and 158 had prior fractures. The correlation between LS BMD and TBS was low (r = 0.28), suggesting these parameters reflect different bone properties. Age- and body mass index-adjusted odds ratios (ORs) ranged from 1.36 to 1.63 for LS or hip BMD in discriminating women with low trauma nonvertebral and vertebral fractures. TBS demonstrated ORs from 2.46 to 2.49 for these respective fractures; these remained significant after adjustment for the lowest BMD T-score (ORs = 2.38 and 2.44). Seventy-three percent of all fractures occurred in women without osteoporosis (BMD T-score > -2.5); 72% of these women had a TBS score below the median, thereby appropriately classifying them as being at increased risk. In conclusion, TBS assessment enhances DXA by evaluating the trabecular pattern and identifying individuals with vertebral or low trauma fracture. TBS identified 66-70% of women with fracture who were not classified as having osteoporosis by BMD alone.
Abstract:
BACKGROUND: Recent data suggest that beta-blockers can be beneficial in subgroups of patients with chronic heart failure (CHF). For metoprolol and carvedilol, an increase in ejection fraction has been shown and favorable effects on the myocardial remodeling process have been reported in some studies. We examined the effects of bisoprolol fumarate on exercise capacity and left ventricular volume with magnetic resonance imaging (MRI) and applied a novel high-resolution MRI tagging technique to determine myocardial rotation and relaxation velocity. METHODS: Twenty-eight patients (mean age, 57 +/- 11 years; mean ejection fraction, 26 +/- 6%) were randomized to bisoprolol fumarate (n = 13) or to placebo therapy (n = 15). The dosage of the drugs was titrated to match that of the Cardiac Insufficiency Bisoprolol Study protocol. Hemodynamic and gas exchange responses to exercise, MRI measurements of left ventricular end-systolic and end-diastolic volumes and ejection fraction, and left ventricular rotation and relaxation velocities were measured before the administration of the drug and 6 and 12 months later. RESULTS: After 1 year, heart rate was reduced in the bisoprolol fumarate group both at rest (81 +/- 12 before therapy versus 61 +/- 11 after therapy; P <.01) and at peak exercise (144 +/- 20 before therapy versus 127 +/- 17 after therapy; P <.01), indicating a reduction in sympathetic drive. No differences were observed in heart rate responses in the placebo group. No differences were observed within or between groups in peak oxygen uptake, although work rate achieved was higher (117.9 +/- 36 watts versus 146.1 +/- 33 watts; P <.05) and exercise time tended to be higher (9.1 +/- 1.7 minutes versus 11.4 +/- 2.8 minutes; P =.06) in the bisoprolol fumarate group. After 1 year, there was a trend toward reductions in left ventricular end-diastolic volume (-54 mL) and end-systolic volume (-62 mL) in the bisoprolol fumarate group. Ejection fraction increased in the bisoprolol fumarate group (25.0 +/- 7% versus 36.2 +/- 9%; P <.05), whereas it remained unchanged in the placebo group. Most changes in volume and ejection fraction occurred during the latter 6 months of treatment. With myocardial tagging, nonsignificant reductions in left ventricular rotation velocity were observed in both groups, whereas relaxation velocity was reduced only after bisoprolol fumarate therapy (by 39%; P <.05). CONCLUSION: One year of bisoprolol fumarate therapy resulted in an improvement in exercise capacity, trends toward reductions in end-diastolic and end-systolic volumes, increased ejection fraction, and significantly reduced relaxation velocity. Although these results generally confirm the beneficial effects of beta-blockade in patients with chronic heart failure, they show differential effects on systolic and diastolic function.
Abstract:
BACKGROUND: Minor protease inhibitor (PI) mutations often exist as polymorphisms in HIV-1 sequences from treatment-naïve patients. Previous studies showed that their presence impairs the antiretroviral treatment (ART) response. Evaluating these findings in a larger cohort is essential. METHODS: To study the impact of minor PI mutations on time to viral suppression and time to virological failure, we included patients from the Swiss HIV Cohort Study infected with HIV-1 subtype B who started first-line ART with a PI and two nucleoside reverse transcriptase inhibitors. Cox regression models were used to compare the outcomes among patients with 0 and ≥ 1 minor PI mutation. Models were adjusted for baseline HIV-1 RNA, CD4 cell count, sex, transmission category, age, ethnicity, year of ART start, and the presence of nucleoside reverse transcriptase inhibitor mutations, and were stratified by the administered PI. RESULTS: We included 1199 patients, of whom 944 (78.7%) received a boosted PI. Minor PI mutations associated with the administered PI were common: 41.7%, 16.1%, 4.7% and 1.9% of patients had 1, 2, 3 or ≥ 4 mutations, respectively. The time to viral suppression was similar between patients with 0 (reference) and ≥ 1 minor PI mutation (multivariable hazard ratio (HR): 1.1 [95% confidence interval (CI): 1.0-1.3], P = .196). The time to virological failure was also similar (multivariable HR: 0.9 [95% CI: 0.5-1.6], P = .765). In addition, the impact of each single minor PI mutation was analyzed separately: none was significantly associated with the treatment outcome. CONCLUSIONS: The presence of minor PI mutations at baseline has no effect on the therapy outcome in HIV-infected individuals.
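A minimal sketch of the kind of adjusted, PI-stratified Cox model described in the methods is shown below, using the lifelines library. All column names (time_to_failure, failure_event, any_minor_pi_mutation, administered_pi, and the covariates) are hypothetical; categorical covariates such as ethnicity and transmission category would need to be encoded before fitting, and this is not the Swiss HIV Cohort Study analysis code.

```python
# Illustrative sketch: Cox model for time to virological failure comparing
# patients with >=1 vs. 0 minor PI mutations, adjusted for baseline covariates
# and stratified by the administered PI. Column names are assumptions.
from lifelines import CoxPHFitter
import pandas as pd

def fit_failure_model(df: pd.DataFrame) -> CoxPHFitter:
    cols = [
        "time_to_failure", "failure_event", "any_minor_pi_mutation",
        "baseline_log_rna", "baseline_cd4", "age", "female",
        "nrti_mutations_present", "year_of_art_start", "administered_pi",
    ]
    cph = CoxPHFitter()
    cph.fit(
        df[cols],
        duration_col="time_to_failure",
        event_col="failure_event",
        strata=["administered_pi"],  # a separate baseline hazard for each PI
    )
    return cph  # cph.hazard_ratios_ contains the adjusted hazard ratios
```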
Abstract:
BACKGROUND: Determining a specific cause of death may facilitate individualized therapy in patients with heart failure (HF). Cardiac resynchronization therapy (CRT) decreased mortality in the Cardiac Resynchronization in Heart Failure trial by reducing pump failure and sudden cardiac death (SCD). This study analyzes predictors of specific causes of death. METHODS AND RESULTS: Univariate and multivariate analyses used 8 baseline and 3-month post-randomization variables to predict pump failure death and SCD (categorized as "definite," "probable," and "possible"). Of 255 deaths, 197 were cardiovascular. There were 71 SCDs, with a risk reduction by CRT of 0.47 (95% confidence interval 0.29-0.76; P = .002) and similar reductions in SCD classified as definite, probable, and possible. Univariate SCD predictors were measures of 3-month HF status (mitral regurgitation [MR] severity, plasma brain natriuretic peptide [BNP], end-diastolic volume, and systolic blood pressure), whereas randomization to CRT decreased risk. Multivariate SCD predictors were randomization to CRT (0.56 [0.53-0.96], P = .035) and 3-month MR severity (1.82 [1.77-2.60], P = .0012). Univariate predictors of pump failure death related to the baseline HF state (quality of life score, interventricular mechanical delay, end-diastolic volume, plasma BNP, MR severity, and systolic pressure), whereas randomization to CRT and nonischemic cardiomyopathy decreased risk; multivariate predictors of pump failure death were baseline plasma BNP, systolic pressure, and randomization to CRT. CONCLUSION: CRT decreased SCD in patients with systolic HF and ventricular dyssynchrony. SCD risk increased with increasing severity of MR (including the 3-month value for MR as a time-dependent covariate) and was reduced by randomization to CRT. The risk of HF death was related to the level of systolic blood pressure, log BNP, and randomization to CRT. These results emphasize the importance and interdependence of HF severity in mortality from pump failure and SCD.
Abstract:
Background: Type 2 diabetes (T2D) is associated with increased fracture risk but paradoxically greater BMD. TBS (trabecular bone score), a novel grey-level texture measurement extracted from DXA images, correlates with 3D parameters of bone micro-architecture. We evaluated the ability of lumbar spine (LS) TBS to account for the increased fracture risk in diabetes. Methods: 29,407 women ≥50 years at the time of baseline hip and spine DXA were identified from a database containing all clinical BMD results for the Province of Manitoba, Canada. 2,356 of the women satisfied a well-validated definition for diabetes, the vast majority of whom (>90%) would have T2D. LS (L1-L4) TBS was derived for each spine DXA examination blinded to clinical parameters and outcomes. Health service records were assessed for incident non-traumatic major osteoporotic fracture codes (mean follow-up 4.7 years). Results: In linear regression adjusted for FRAX risk factors (age, BMI, glucocorticoids, prior major fracture, rheumatoid arthritis, COPD as a smoking proxy, alcohol abuse) and osteoporosis therapy, diabetes was associated with higher BMD for the LS, femoral neck and total hip but lower LS TBS (all p < 0.001). Similar results were seen after excluding obese subjects with BMI > 30. In logistic regression, the adjusted odds ratio (OR) for a skeletal measurement in the lowest vs highest tertile was less than 1 for all BMD measurements but greater than 1 for LS TBS (adjusted OR 2.61, 95% CI 2.30-2.97). Major osteoporotic fractures were identified in 175 (7.4%) women with and 1,493 (5.5%) without diabetes (p < 0.001). LS TBS predicted fractures in those with diabetes (adjusted HR 1.27, 95% CI 1.10-1.46) and without diabetes (HR 1.31, 95% CI 1.24-1.38). LS TBS was an independent predictor of fracture (p < 0.05) when further adjusted for BMD (LS, femoral neck or total hip). The explanatory effect of diabetes in the fracture prediction model was greatly reduced when LS TBS was added to the model (indicating that TBS captured a large portion of the diabetes-associated risk), but was paradoxically increased by adding any of the BMD measurements. Conclusions: Lumbar spine TBS is sensitive to skeletal deterioration in postmenopausal women with diabetes, whereas BMD is paradoxically greater. LS TBS predicts osteoporotic fractures in those with diabetes and captures a large portion of the diabetes-associated fracture risk. Combining LS TBS with BMD incrementally improves fracture prediction.
Abstract:
Osteoporosis (OP) is a systemic skeletal disease characterized by low bone mineral density (BMD) and micro-architectural (MA) deterioration. Clinical risk factors (CRF) are often used as an approximation of MA. MA can now be evaluated in daily practice with the trabecular bone score (TBS), which is very simple to obtain by reanalyzing a lumbar spine DXA scan. TBS has proven diagnostic and prognostic value, partially independent of CRF and BMD. The aim of the OsteoLaus cohort is to combine, in daily practice, the CRF and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. The OsteoLaus cohort (1400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. It is derived from the COLAUS cohort, which started in Lausanne in 2003 with the main goal of obtaining information on the epidemiology and genetic determinants of cardiovascular risk in 6700 men and women. CRF for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA, and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported here. We included 631 women: mean age 67.4 ± 6.7 years, BMI 26.1 ± 4.6, mean lumbar spine BMD 0.943 ± 0.168 (T-score −1.4 SD), and TBS 1.271 ± 0.103. As expected, the correlation between BMD and site-matched TBS is low (r2 = 0.16). The prevalence of grade 2/3 vertebral fractures, major OP fractures and all OP fractures is 8.4%, 17.0% and 26.0%, respectively. Age- and BMI-adjusted ORs (per SD decrease) are 1.8 (1.2-2.5), 1.6 (1.2-2.1), and 1.3 (1.1-1.6) for BMD for these fracture categories, and 2.0 (1.4-3.0), 1.9 (1.4-2.5), and 1.4 (1.1-1.7) for TBS, respectively. Considered alone, a BMD T-score < −2.5 or a TBS < 1.200 each identifies only 32 to 37% of women with an OP fracture. When the two criteria are combined (BMD T-score < −2.5 or TBS < 1.200), 54 to 60% of women with an osteoporotic fracture are identified. As in previously published studies, these preliminary results confirm the partial independence of BMD and TBS. More importantly, adding TBS to BMD significantly increases the identification of women with prevalent OP fracture who would have been misclassified by BMD alone. For the first time we can obtain complementary information on fracture (VFA), density (BMD), and micro- and macro-architecture (TBS and HAS) from a single simple, inexpensive, low-radiation device: DXA. Such complementary information is very useful for the patient in daily practice and will likely also have an impact on cost-effectiveness analyses.
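The combined case-finding rule in these preliminary results reduces to a simple check, sketched below; the function name and inputs are illustrative assumptions, while the −2.5 T-score and 1.200 TBS cut-offs come from the abstract.

```python
# Illustrative sketch of the combined OsteoLaus case-finding rule:
# flag a woman as high risk if BMD T-score < -2.5 or TBS < 1.200.
def flag_high_risk(bmd_t_score: float, tbs: float) -> bool:
    """Combined rule: lumbar spine BMD T-score below -2.5 or TBS below 1.200."""
    return bmd_t_score < -2.5 or tbs < 1.200

# A woman with a T-score of -1.8 (not osteoporotic by BMD) but a TBS of 1.15
# is flagged by the combined rule, illustrating the gain over BMD alone.
assert flag_high_risk(-1.8, 1.15)
```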
Abstract:
Using a large prospective cohort of over 12,000 women, we determined 2 thresholds (high risk and low risk of hip fracture) for use in a 10-yr hip fracture probability model that we had previously described, a model combining the heel stiffness index measured by quantitative ultrasound (QUS) with a set of easily determined clinical risk factors (CRFs). The model identified a higher percentage of women with fractures as high risk than a previously reported risk score that combined QUS and CRFs. In addition, it categorized women in a way that was quite consistent with the categorization obtained using dual X-ray absorptiometry (DXA) and the World Health Organization (WHO) classification system; the 2 methods identified similar percentages of women with and without fractures in each of their 3 categories, but only partly identified the same women. Nevertheless, combining our composite probability model with DXA in a case-finding strategy will likely further improve the detection of women at high risk of fragility hip fracture. We conclude that the currently proposed model may be of some use as an alternative to the WHO classification criteria for osteoporosis, at least when access to DXA is limited.
Abstract:
Plasma urate levels are higher in humans than rodents (240-360 vs. ~30 μM) because humans lack the liver enzyme uricase. High uricemia in humans may protect against oxidative stress, but hyperuricemia also associates with the metabolic syndrome, and urate and uric acid can crystallize to cause gout and renal dysfunction. Thus, hyperuricemic animal models to study urate-induced pathologies are needed. We recently generated mice with liver-specific ablation of Glut9, a urate transporter providing access of urate to uricase (LG9KO mice). LG9KO mice had moderately high uricemia (~120 μM). To further increase their uricemia, here we gavaged LG9KO mice for 3 days with inosine, a urate precursor; this treatment was applied in both chow- and high-fat-fed mice. In chow-fed LG9KO mice, uricemia peaked at 300 μM 2 h after the first gavage and normalized 24 h after the last gavage. In contrast, in high-fat-fed LG9KO mice, uricemia further rose to 500 μM. Plasma creatinine strongly increased, indicating acute renal failure. Kidneys showed tubule dilation, macrophage infiltration, and urate and uric acid crystals, associated with a more acidic urine. Six weeks after inosine gavage, plasma urate and creatinine had normalized. However, renal inflammation, fibrosis, and organ remodeling had developed despite the disappearance of urate and uric acid crystals. Thus, hyperuricemia and high-fat diet feeding combined to induce acute renal failure. Furthermore, a sterile inflammation caused by the initial crystal-induced lesions developed despite the disappearance of urate and uric acid crystals.
Abstract:
PURPOSE: Positron emission tomography with (18)F-fluorodeoxyglucose (FDG-PET) was used to evaluate treatment response in patients with gastrointestinal stromal tumors (GIST) treated with sunitinib, a multitargeted tyrosine kinase inhibitor, after imatinib failure. PATIENTS AND METHODS: Tumor metabolism was assessed with FDG-PET before and after the first 4 weeks of sunitinib therapy in 23 patients who received one to 12 cycles of sunitinib therapy (4 weeks of 50 mg/d, 2 weeks off). Treatment response was expressed as the percent change in maximal standardized uptake values (SUV). The primary end point of time to tumor progression, based on traditional Response Evaluation Criteria in Solid Tumors (RECIST) criteria, was compared with early PET results. RESULTS: Progression-free survival (PFS) was correlated with early FDG-PET metabolic response (P < .0001). Using -25% and +25% thresholds for SUV variations from baseline, early FDG-PET response was stratified into metabolic partial response, metabolically stable disease, or metabolically progressive disease; median PFS rates were 29, 16, and 4 weeks, respectively. Similarly, when a single positive/negative FDG-PET cutoff was applied after 4 weeks of sunitinib, median PFS was 29 weeks for SUVs less than 8 g/mL versus 4 weeks for SUVs of 8 g/mL or greater (P < .0001). None of the patients with metabolically progressive disease subsequently responded according to RECIST criteria. Multivariate analysis showed shorter PFS in patients who had higher residual SUVs (P < .0001), primary resistance to imatinib (P = .024), or nongastric GIST (P = .002), regardless of the mutational status of the KIT and PDGFRA genes. CONCLUSION: Week 4 FDG-PET is useful for early assessment of treatment response and for the prediction of clinical outcome. Thus, it offers opportunities to individualize and optimize patient therapy.
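The -25%/+25% thresholds above amount to a simple classification of the week-4 scan relative to baseline. The sketch below illustrates that rule; the function name and category labels are my own shorthand for the abstract's thresholds, not the authors' full criteria or a formal standard such as PERCIST.

```python
# Illustrative sketch of the early FDG-PET response stratification described
# above, using the abstract's -25%/+25% thresholds on the change in SUVmax.
def metabolic_response(baseline_suv: float, week4_suv: float) -> str:
    """Classify early FDG-PET response from the percent change in maximal SUV."""
    change_pct = 100.0 * (week4_suv - baseline_suv) / baseline_suv
    if change_pct <= -25.0:
        return "metabolic partial response"         # median PFS 29 weeks in this series
    if change_pct >= 25.0:
        return "metabolically progressive disease"  # median PFS 4 weeks
    return "metabolically stable disease"           # median PFS 16 weeks

# Example: SUVmax falling from 10.0 to 6.0 (a -40% change) is a metabolic partial response.
print(metabolic_response(10.0, 6.0))
```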