120 results for Celiac Disease -- blood
Abstract:
Erythropoietin (EPO) and iron deficiency as causes of anemia in patients with limited renal function or end-stage renal disease are well addressed. The concomitant impairment of red blood cell (RBC) survival has been largely neglected. Properties of the uremic environment, such as inflammation, increased oxidative stress and uremic toxins, appear responsible for premature changes in the RBC membrane and cytoskeleton. The exposure of antigenic sites and the breakdown of phosphatidylserine asymmetry promote RBC phagocytosis. While the individual response to treatment with erythropoiesis-stimulating agents (ESAs) depends on both RBC lifespan and production rate, uniform dosing algorithms do not meet that demand. The clinical use of mathematical models predicting ESA-induced changes in hematocrit might be greatly improved once independent estimates of RBC production rate and/or lifespan become available, making the concomitant estimation of both parameters unnecessary. Since heme breakdown by the heme oxygenase pathway yields carbon monoxide (CO), which is exhaled, a simple CO breath test has been used to calculate hemoglobin turnover and therefore RBC survival and lifespan. Future research is needed to validate and implement this method in patients with kidney failure. This will provide new insights into RBC kinetics in renal patients and is ultimately expected to improve our understanding of hemoglobin variability in response to ESAs.
Abstract:
OBJECTIVE: A case of Lhermitte-Duclos disease (LDD, dysplastic gangliocytoma) with atypical vascularization is reported. LDD is a rare cerebellar mass lesion which may be associated with Cowden's syndrome and the PTEN germline mutation. CASE MATERIAL: A 61-year-old male had first presented 15 years earlier with a transient episode of unspecific gait disturbance. Initial magnetic resonance (MR) imaging revealed a right-sided, diffuse, nonenhancing cerebellar mass lesion. No definitive diagnosis was made at that time, and the symptoms resolved spontaneously. Fifteen years later, the patient presented with acute onset of vomiting associated with headache and ataxic gait. MR imaging showed progression of the lesion with occlusive hydrocephalus. The lesion showed the striated pattern characteristic of LDD, with T1-hypointense and T2-hyperintense bands and no contrast enhancement. After resection of the mass lesion, the cerebellar and hydrocephalic symptoms improved rapidly. Pathological examination confirmed the diagnosis of dysplastic gangliocytoma (WHO Grade I) with enlarged granular and molecular cell layers, reactive gliosis and dysplastic blood vessels. No other clinical features associated with Cowden's syndrome were present. CONCLUSIONS: This case illustrates that LDD with atypical vascularization is a slow-growing posterior fossa mass lesion which may remain asymptomatic for many years. Timing of surgical treatment and extent of resection in patients with LDD are controversial. The typical features on standard T1-/T2-weighted MR imaging allow a diagnosis without surgery in most cases. The authors believe that the decision to treat in these cases should be based on clinical deterioration.
Abstract:
BACKGROUND: Being a caregiver for a spouse with Alzheimer's disease is associated with increased risk for cardiovascular illness, particularly for males. This study examined the effects of caregiver gender and severity of the spouse's dementia on sleep, coagulation, and inflammation in the caregiver. METHODS: Eighty-one male and female spousal caregivers and 41 non-caregivers participated (mean age of all participants 70.2 years). Full-night polysomnography (PSG) was recorded in each participant's home. Severity of the Alzheimer's disease patient's dementia was determined by the Clinical Dementia Rating (CDR) scale. The Role Overload scale was completed as an assessment of caregiving stress. Blood was drawn to assess circulating levels of D-dimer and interleukin-6 (IL-6). RESULTS: Male caregivers who were caring for a spouse with moderate to severe dementia spent significantly more time awake after sleep onset than female caregivers caring for spouses with moderate to severe dementia (p=.011), who spent a similar amount of time awake after sleep onset to caregivers of low-dementia spouses and to non-caregivers. Similarly, male caregivers caring for spouses with worse dementia had significantly higher circulating levels of D-dimer (p=.034) than females caring for spouses with worse dementia. In multiple regression analysis (adjusted R(2)=.270, p<.001), elevated D-dimer levels were predicted by a combination of the patient's CDR rating (p=.047) and greater time awake after sleep onset (p=.046). DISCUSSION: The findings suggest that males caring for spouses with more severe dementia experience more disturbed sleep and show greater coagulation activity, the latter being associated with the disturbed sleep. These findings may provide insight into why male caregivers of spouses with Alzheimer's disease are at increased risk for illness, particularly cardiovascular disease.
Abstract:
BACKGROUND: Elderly individuals who provide care to a spouse suffering from dementia bear an increased risk of coronary heart disease (CHD). OBJECTIVE: To test the hypothesis that the Framingham CHD Risk Score would be higher in dementia caregivers relative to non-caregiving controls. METHODS: We investigated 64 caregivers providing in-home care for their spouse with Alzheimer's disease and 41 gender-matched non-caregiving controls. All subjects (mean age 70 +/- 8 years, 75% women, 93% Caucasian) had a negative history of CHD and cerebrovascular disease. The original Framingham CHD Risk Score was computed by adding up categorical scores for age, blood lipids, blood pressure, diabetes, and smoking, with adjustment for sex. RESULTS: The average CHD risk score was higher in caregivers than in controls even when co-varying for socioeconomic status, health habits, medication, and psychological distress (8.0 +/- 2.9 vs. 6.3 +/- 3.0 points, p = 0.013). The difference showed a medium effect size (Cohen's d = 0.57). A relatively higher blood pressure in caregivers than in controls made the greatest contribution to this difference. The probability (area under the receiver operating characteristic curve) that a randomly selected caregiver had a greater CHD risk score than a randomly selected non-caregiver was 65.5%. CONCLUSIONS: Based on the Framingham CHD Risk Score, the potential to develop overt CHD in the following 10 years was predicted to be greater in dementia caregivers than in non-caregiving controls. The magnitude of the difference in CHD risk between caregivers and controls appears to be clinically relevant. Clinicians may want to monitor caregiving status as a routine part of the standard evaluation of their elderly patients' cardiovascular risk.
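As the abstract notes, the original Framingham CHD Risk Score is assembled by mapping each risk factor to a categorical point value and summing the points. The sketch below illustrates that general mechanism only; the point bands and values are invented for illustration and are not the published Framingham tables.

```python
# Illustrative sketch of a categorical risk score: each factor maps to
# points via banded cut-offs, and the points are summed. The bands and
# point values below are hypothetical, NOT the published Framingham ones.

def categorical_points(value, bands):
    """Return the points of the first band whose upper bound exceeds `value`.

    `bands` is a list of (upper_bound, points) pairs sorted by bound.
    """
    for upper, points in bands:
        if value < upper:
            return points
    return bands[-1][1]

# Hypothetical point bands for two continuous factors.
AGE_BANDS = [(40, 0), (50, 2), (60, 4), (float("inf"), 6)]
SBP_BANDS = [(120, 0), (140, 1), (160, 2), (float("inf"), 3)]

def chd_risk_score(age, systolic_bp, diabetic, smoker):
    """Sum categorical points across risk factors (illustrative values)."""
    score = categorical_points(age, AGE_BANDS)
    score += categorical_points(systolic_bp, SBP_BANDS)
    score += 2 if diabetic else 0
    score += 2 if smoker else 0
    return score

# A 70-year-old non-diabetic non-smoker with SBP 145 mmHg:
print(chd_risk_score(age=70, systolic_bp=145, diabetic=False, smoker=False))  # 8
```

In the real score, sex-specific tables and further factors (lipids, HDL) enter the sum in the same banded fashion before the total is mapped to a 10-year risk estimate.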
Abstract:
BACKGROUND: The question whether patients suffering from end-stage emphysema who are candidates for lung transplantation should be treated with a single-lung or a double-lung transplantation is still unanswered. METHODS: We reviewed 24 consecutive lung transplant procedures, comparing the results of 6 patients with a unilateral and 17 with a bilateral transplantation. PATIENTS AND RESULTS: After bilateral transplantation the patients showed a trend towards better blood gas exchange, with shorter time on the ventilator and in intensive care, compared with patients after the unilateral procedure. Three-year actuarial survival was higher in the group after bilateral transplantation (83% versus 67%). There was a continuous improvement in pulmonary function in both groups during the first months after transplantation. Vital capacity and forced expiratory volume in the first second were significantly higher in the bilateral transplant group. CONCLUSION: Both unilateral and bilateral transplantation are feasible for patients with end-stage emphysema. Bilateral transplantation results in better pulmonary reserve capacity and faster rehabilitation.
Abstract:
The current organ shortage in transplantation medicine stimulates the exploration of new strategies to expand the donor pool, including the utilisation of living donors, ABO-incompatible grafts, and xenotransplantation. Preformed natural antibodies (Ab) such as anti-Gal or anti-A/B Ab mediate hyperacute graft rejection and thus represent a major hurdle to the employment of such strategies. In contrast to solid organ transplantation (SOT), ABO blood group incompatibilities are of minor importance in haematopoietic stem cell transplantation (HSCT). Thus, ABO-incompatible HSCT may serve as an in vivo model to study carbohydrate antigen (Ag)-mismatched transplantations such as ABO-incompatible SOT or the effect of preformed Ab against Gal in xenotransplantation. This mini-review summarises our clinical and experimental studies performed with the support of the Swiss National Science Foundation program on Implants and Transplants (NFP-46). Part 1 describes data on the clinical outcome of ABO-incompatible HSCT, in particular the incidence of several immunohaematological complications, acute graft-versus-host disease (GvHD), and overall survival. Part 2 summarises the measurements of anti-A/B Ab in healthy blood donors and ABO-incompatible HSCT using a novel flow cytometry-based method, and the potential mechanisms responsible for the loss of anti-A/B Ab observed following minor ABO-incompatible HSCT, i.e. the occurrence of humoral tolerance. Part 3 analyses the potential of eliminating Gal expression, as well as of specific complement inhibitors such as dextran sulfate and synthetic tyrosine analogues, to protect porcine endothelial cells from xenoreactive Ab-mediated damage in vitro and in a hamster-to-rat heart transplantation model. In conclusion, owing to the similarities between the immunological hurdles of ABO-incompatible transplantation and xenotransplantation, the knowledge obtained from both fields might lead to new strategies to overcome humoral rejection in transplantation.
Abstract:
BACKGROUND: Myocardial contrast echocardiography (MCE) is able to measure in vivo relative blood volume (rBV, i.e., capillary density) and its exchange frequency b, the constituents of myocardial blood flow (MBF, ml min-1 g-1). This study aimed to assess, by MCE, whether left ventricular hypertrophy (LVH) in hypertrophic cardiomyopathy (HCM) can be differentiated from LVH in triathletes (athlete's heart, AH) or in patients with hypertensive heart disease (HHD). METHODS: Sixty individuals, matched for age (33 +/- 10 years) and gender, and subdivided into four groups (n = 15), were examined: HCM, AH, HHD and a group of sedentary individuals without LVH (S). rBV (ml ml-1), b (min-1) and MBF, at rest and during adenosine-induced hyperaemia, were derived by MCE in mid septal, lateral and inferior regions. The ratio of MBF during hyperaemia to MBF at rest yielded the myocardial blood flow reserve (MBFR). RESULTS: Septal wall rBV at rest was lower in HCM (0.084 +/- 0.023 ml ml-1) than in AH (0.151 +/- 0.024 ml ml-1, p <0.01) and in S (0.129 +/- 0.026 ml ml-1, p <0.01), but was similar to HHD (0.097 +/- 0.016 ml ml-1). Conversely, MBFR was lowest in HCM (1.67 +/- 0.93), followed by HHD (2.8 +/- 0.93, p <0.01), S (3.36 +/- 1.03, p <0.001) and AH (4.74 +/- 1.46, p <0.0001). At rest, rBV <0.11 ml ml-1 accurately distinguished between HCM and AH (sensitivity 99%, specificity 99%); similarly, MBFR < or =1.8 helped to distinguish between HCM and HHD (sensitivity 100%, specificity 77%). CONCLUSIONS: rBV at rest most accurately distinguishes between pathological LVH due to HCM and physiological, endurance-exercise-induced LVH.
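The quantities in this abstract compose simply: MBF is the product of relative blood volume and its exchange frequency, and MBFR is hyperaemic over resting MBF. A minimal sketch, using the abstract's HCM septal rBV but hypothetical exchange frequencies (b is not reported in the abstract):

```python
# Sketch of the MCE-derived quantities: MBF = rBV * b, and the flow
# reserve MBFR = MBF(hyperaemia) / MBF(rest). The exchange frequencies
# below are invented for illustration; only rBV (0.084 ml/ml, HCM
# septal wall at rest) and the resulting MBFR (1.67) come from the text.

def mbf(rbv_ml_per_ml, b_per_min):
    """Myocardial blood flow as relative blood volume times exchange frequency."""
    return rbv_ml_per_ml * b_per_min

def mbfr(mbf_hyperaemia, mbf_rest):
    """Myocardial blood flow reserve: hyperaemic over resting flow."""
    return mbf_hyperaemia / mbf_rest

rest = mbf(0.084, 10.0)     # hypothetical resting exchange frequency
stress = mbf(0.084, 16.7)   # hypothetical hyperaemic exchange frequency
print(round(mbfr(stress, rest), 2))  # 1.67, matching the HCM group mean
```

Note that in reality rBV can also change between rest and hyperaemia; holding it fixed here is a simplification to make the ratio arithmetic visible.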
Abstract:
BACKGROUND: The purpose of the study was to investigate allogeneic blood transfusion (ABT) and preoperative anemia as risk factors for surgical site infection (SSI). STUDY DESIGN AND METHODS: A prospective, observational cohort of 5873 consecutive general surgical procedures at Basel University Hospital was analyzed to determine the relationship between perioperative ABT and preoperative anemia and the incidence of SSI. ABT was defined as transfusion of leukoreduced red blood cells during surgery, and anemia as a hemoglobin concentration of less than 120 g/L before surgery. Surgical wounds and resulting infections were assessed according to Centers for Disease Control standards. RESULTS: The overall SSI rate was 4.8% (284 of 5873). In univariable logistic regression analyses, perioperative ABT (crude odds ratio [OR], 2.93; 95% confidence interval [CI], 2.1 to 4.0; p < 0.001) and preoperative anemia (crude OR, 1.32; 95% CI, 1.0 to 1.7; p = 0.037) were significantly associated with increased odds of SSI. After adjusting for 13 characteristics of the patient and the procedure in multivariable analyses, associations were substantially reduced for ABT (OR, 1.25; 95% CI, 0.8 to 1.9; p = 0.310; OR, 1.07; 95% CI, 0.6 to 2.0; p = 0.817 for 1-2 blood units and >or=3 blood units, respectively) and anemia (OR, 0.91; 95% CI, 0.7 to 1.2; p = 0.530). Duration of surgery was the main confounding variable. CONCLUSION: Our findings point to important confounding factors and strengthen existing doubts about leukoreduced ABT during general surgery and preoperative anemia as risk factors for SSI.
Abstract:
Chronic myeloid leukemia (CML) is a malignant myeloproliferative disease with a characteristic chronic phase (cp) of several years before progression to blast crisis (bc). The immune system may contribute to disease control in CML. We analyzed leukemia-specific immune responses in cpCML and bcCML in a retrovirally induced murine CML model. In the presence of cpCML and bcCML expressing the glycoprotein of lymphocytic choriomeningitis virus as a model leukemia antigen, leukemia-specific cytotoxic T lymphocytes (CTLs) became exhausted. They maintained only limited cytotoxic activity, did not produce interferon-gamma or tumor necrosis factor-alpha, and did not expand after restimulation. CML-specific CTLs were characterized by high expression of programmed death 1 (PD-1), whereas CML cells expressed PD-ligand 1 (PD-L1). Blocking the PD-1/PD-L1 interaction, either by generating bcCML in PD-1-deficient mice or by repetitive administration of anti-PD-L1 antibody, prolonged survival. In addition, we found that PD-1 is up-regulated on CD8(+) T cells from CML patients. Taken together, our results suggest that blocking the PD-1/PD-L1 interaction may restore the function of CML-specific CTLs and may represent a novel therapeutic approach for CML.
Abstract:
OBJECTIVES: To report a novel observation of neutrophil signal transduction abnormalities in patients with localized aggressive periodontitis (LAP) that are associated with enhanced phosphorylation of the nuclear signal transduction protein cyclic AMP response element-binding factor (CREB). METHOD AND MATERIALS: Peripheral venous blood neutrophils of 18 subjects, 9 patients with LAP and 9 race-, sex-, and age-matched healthy controls, were isolated and prepared using the Ficoll-Hypaque density-gradient technique. Neutrophils (5.4 x 10(6)/mL) were stimulated with the chemoattractant FMLP (10(-6) mol/L) for 5 minutes and lysed. Aliquots of these samples were separated by SDS-PAGE (60 microg/lane) on 9.0% (w/v) polyacrylamide slab gels and transferred electrophoretically to polyvinyl difluoride membranes. The cell lysates were immunoblotted with a 1:1,000 dilution of rabbit anti-phospho-CREB antibody, which recognizes only the form of CREB phosphorylated at Ser133. The activated CREB was visualized with a luminol-enhanced chemiluminescence detection system and evaluated by laser densitometry. RESULTS: In patients with LAP, unstimulated peripheral blood neutrophils displayed an average CREB activation of 80.3%, a 17.5-fold overexpression compared to healthy controls (4.6%). CONCLUSION: LAP neutrophils expressing this phenotype appear to be constitutively primed, as evidenced by activated CREB in resting cells compared to normal individuals. The genetically primed neutrophil phenotype may contribute to neutrophil-mediated tissue damage in the pathogenesis of LAP.
Abstract:
The National Institutes of Health (NIH) classification of graft-versus-host disease (GVHD) is a significant improvement over prior classifications, and has prognostic implications. We hypothesized that the NIH classification of GVHD would predict the survival of patients with GVHD treated with extracorporeal photopheresis (ECP). Sixty-four patients with steroid-refractory/dependent GVHD treated with ECP were studied. The 3-year overall survival (OS) was 36% (95% confidence interval [CI] 13-59). Progressive GVHD was seen in 39% of patients with any acute GVHD (aGVHD) (classic acute, recurrent acute, overlap) compared to 3% of patients with classic chronic GVHD (cGVHD) (P=.002). OS was superior for patients with classic cGVHD (median survival, not reached) compared to overlap GVHD (median survival, 395 days; 95% CI, 101 to not reached) and aGVHD (delayed, recurrent or persistent) (median survival, 72 days; 95% CI, 39-152). In univariate analyses, significant predictors of survival after ECP included GVHD subtype, bilirubin, platelet count, and steroid dose. In multivariate analyses, overlap plus classic cGVHD was an independent prognostic feature predictive of superior survival (hazard ratio [HR] 0.34; 95% CI, 0.14-0.8; P=.014). This study suggests that the NIH classification can predict outcome after ECP for steroid-refractory/dependent GVHD.
Abstract:
To prospectively compare the diagnostic accuracy of steady-state, high-spatial-resolution magnetic resonance (MR) angiography of the lower leg, performed with a blood pool contrast agent, with selective digital subtraction angiography (DSA) as the reference standard in patients with symptomatic peripheral arterial disease.
Abstract:
BACKGROUND AND AIMS: The splanchnic circulation has an important function in the body under both physiological and pathophysiological conditions. Despite its importance, no reliable noninvasive procedures for estimating splanchnic circulation have been established. The aim of this study was to evaluate MRI as a tool for assessing intra-abdominal blood flows of the aorta, portal vein (VPO) and the major intestinal and hepatic vessels. METHODS: In nine healthy volunteers, the proximal aorta (AOP) and distal abdominal aorta (AOD), superior mesenteric artery (SAM), celiac trunk (CTR), hepatic arteries (common and proper hepatic arteries, AHC and AHP, respectively), and VPO were localized on contrast-enhanced magnetic resonance angiography images. Volumetric flow was measured using a two-dimensional cine electrocardiogram-gated phase contrast technique. Measurements were taken before and 30 min after continuous intravenous infusion of somatostatin (250 microg/h) and were independently evaluated by two investigators. RESULTS: Blood flow measured by MRI in the VPO, SAM, AOP, AHP, and CTR significantly decreased after drug infusion. Flows in the AOD and AHC showed a tendency to decrease (P>0.05). Interrater agreement on flows in MRI was very good for large vessels (VPO, AOP, and AOD), with a concordance correlation coefficient of 0.94, as well as for smaller vessels such as the CTR, AHC, AHP, and SAM (concordance correlation coefficient = 0.78). CONCLUSION: Somatostatin-induced blood flow changes in the splanchnic region were reliably detected by MRI. MRI may be useful for the noninvasive assessment of blood flow changes in the splanchnic region.
Abstract:
OBJECTIVE: To evaluate the association between arterial blood pressure (ABP) during the first 24 h and mortality in sepsis. DESIGN: Retrospective cohort study. SETTING: Multidisciplinary intensive care unit (ICU). PATIENTS AND PARTICIPANTS: A total of 274 septic patients. INTERVENTIONS: None. MEASUREMENTS AND RESULTS: Hemodynamic and laboratory parameters were extracted from a PDMS database. The hourly time integral of ABP drops below clinically relevant systolic arterial pressure (SAP), mean arterial pressure (MAP), and mean perfusion pressure (MPP = MAP - central venous pressure) levels was calculated for the first 24 h after ICU admission and compared with 28-day mortality. Binary and linear regression models (adjusted for SAPS II as a measure of disease severity) and a receiver operating characteristic (ROC) analysis were applied. The areas under the ROC curve were largest for the hourly time integrals of ABP drops below MAP 60 mmHg (0.779 vs. 0.764 for ABP drops below MAP 55 mmHg; P < or = 0.01) and MPP 45 mmHg. No association between the hourly time integrals of ABP drops below certain SAP levels and mortality was detected. One or more episodes of MAP < 60 mmHg were associated with a 2.96-fold increase in the risk of death (CI 95%, 1.06-10.36; P = 0.04). The area under the ROC curve to predict the need for renal replacement therapy was highest for the hourly time integral of ABP drops below MAP 75 mmHg. CONCLUSIONS: A MAP level > or = 60 mmHg may be as safe as higher MAP levels during the first 24 h of ICU therapy in septic patients. A higher MAP may be required to maintain kidney function.
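The "hourly time integral of ABP drops" used above can be sketched as a simple accumulation: over each sampling interval, the shortfall of MAP below the threshold is weighted by the interval length. The sampling interval and example readings below are invented for illustration; the abstract does not specify how its PDMS data were sampled.

```python
# Hedged sketch of a time integral of blood-pressure drops below a
# threshold: sum (threshold - MAP) * dt over intervals where MAP is
# below the threshold. Readings and interval are hypothetical.

def time_integral_below(map_readings_mmhg, threshold_mmhg, interval_h):
    """Integral (mmHg * h) of MAP shortfall below `threshold_mmhg`.

    `map_readings_mmhg` is a sequence of MAP values sampled every
    `interval_h` hours; readings at or above the threshold contribute 0.
    """
    return sum(
        (threshold_mmhg - m) * interval_h
        for m in map_readings_mmhg
        if m < threshold_mmhg
    )

# Example: hourly readings over 4 h against a 60 mmHg threshold.
readings = [65, 58, 55, 62]
print(time_integral_below(readings, 60, 1.0))  # (60-58) + (60-55) = 7.0
```

Dividing such an integral by the observation window (here 24 h in the study) would yield the "hourly" normalization compared across patients.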
Abstract:
INTRODUCTION: It is unclear to which level mean arterial blood pressure (MAP) should be increased during septic shock in order to improve outcome. In this study we investigated the association between MAP values of 70 mmHg or higher, vasopressor load, 28-day mortality and disease-related events in septic shock. METHODS: This is a post hoc analysis of data from the control group of a multicenter trial and includes 290 septic shock patients in whom a mean MAP > or = 70 mmHg could be maintained during shock. Demographic and clinical data, MAP, vasopressor requirements during the shock period, disease-related events and 28-day mortality were documented. Logistic regression models, adjusted for the geographic region of the study center, age, presence of chronic arterial hypertension, simplified acute physiology score (SAPS) II and the mean vasopressor load during the shock period, were calculated to investigate the association between MAP or MAP quartiles > or = 70 mmHg and mortality or the frequency and occurrence of disease-related events. RESULTS: There was no association between MAP or MAP quartiles and mortality or the occurrence of disease-related events. These associations were not influenced by age or pre-existing arterial hypertension (all P > 0.05). The mean vasopressor load was associated with mortality (relative risk (RR), 1.83; CI 95%, 1.4-2.38; P < 0.001), the number of disease-related events (P < 0.001) and the occurrence of acute circulatory failure (RR, 1.64; CI 95%, 1.28-2.11; P < 0.001), metabolic acidosis (RR, 1.79; CI 95%, 1.38-2.32; P < 0.001), renal failure (RR, 1.49; CI 95%, 1.17-1.89; P = 0.001) and thrombocytopenia (RR, 1.33; CI 95%, 1.06-1.68; P = 0.01). CONCLUSIONS: MAP levels of 70 mmHg or higher do not appear to be associated with improved survival in septic shock. Elevating MAP >70 mmHg by augmenting vasopressor dosages may increase mortality. Future trials are needed to identify the lowest acceptable MAP level to ensure tissue perfusion and avoid unnecessarily high catecholamine infusions.