982 results for Health Sciences, Medicine and Surgery | Health Sciences, Epidemiology | Health Sciences, Immunology
Abstract:
Introduction and objective. A number of prognostic factors have been reported for predicting survival in patients with renal cell carcinoma (RCC), yet few studies have analyzed the effects of those factors at different stages of the disease process. In this study, three stages of disease progression were evaluated: from nephrectomy to metastasis, from metastasis to death, and from evaluation to death.

Methods. In this retrospective follow-up study, the records of 97 deceased RCC patients were reviewed between September 2006 and October 2006. Patients with TNM Stage IV disease before nephrectomy or with cancer diagnoses other than RCC were excluded, leaving 64 records for analysis. TNM stage, Fuhrman grade, age, tumor size, tumor volume, histology, and patient gender were analyzed in relation to time to metastasis. Time from nephrectomy to metastasis, along with the same factors, was tested for significance in relation to time from metastasis to death. Finally, laboratory values at the time of evaluation, Eastern Cooperative Oncology Group (ECOG) performance status, UCLA Integrated Staging System (UISS) risk group, time from nephrectomy to metastasis, and the same factors were tested for significance in relation to time from evaluation to death. Linear regression and Cox proportional hazards models (univariate and multivariate) were used to test for significance. The Kaplan-Meier log-rank test was used to detect differences between groups at various endpoints.

Results. Compared with patients who had negative lymph nodes at the time of nephrectomy, patients with a single positive lymph node had a significantly shorter time to metastasis (p<0.0001). Compared with other histological types, clear cell histology had significantly longer metastasis-free survival (p=0.003). Clear cell histology versus other types (p=0.0002 univariate, p=0.038 multivariate) and log-transformed time to metastasis (p=0.028) significantly affected time from metastasis to death. Metastasis-free intervals of greater than one year and greater than two years, compared with metastasis before one and two years, conferred a statistically significant survival benefit (p=0.004 and p=0.0318, respectively). Time from evaluation to death was affected by a greater than one year metastasis-free interval (p=0.0459), alcohol consumption (p=0.044), LDH (p=0.006), ECOG performance status (p<0.001), and hemoglobin level (p=0.0092). The UISS stratified the patient population's survival risk in a statistically significant manner (p=0.001). No other factors were found to be significant.

Conclusion. Clear cell histology is predictive of both time to metastasis and time from metastasis to death. Nodal status at the time of nephrectomy may predict risk of metastasis. The time interval to metastasis significantly predicts time from metastasis to death and time from evaluation to death. ECOG performance status and hemoglobin level predict survival outcome at evaluation. Finally, the UISS appropriately stratifies risk in our population.
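A minimal sketch of the kind of Kaplan-Meier log-rank comparison reported above, written with the lifelines library on synthetic data; the group split, follow-up times, and event indicators are all made up for illustration, not taken from the study.

```python
# Illustrative only: synthetic data, not the study's records.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n = 64  # matches the analyzed record count, but the values are random

# Hypothetical follow-up: months from metastasis to death or censoring,
# split by whether the metastasis-free interval exceeded one year.
late_met = rng.integers(0, 2, n).astype(bool)
months = rng.exponential(scale=np.where(late_met, 30.0, 15.0))
died = rng.random(n) < 0.9  # True where death was observed

kmf = KaplanMeierFitter()
for mask, label in [(late_met, ">1 yr metastasis-free"),
                    (~late_met, "<=1 yr metastasis-free")]:
    kmf.fit(months[mask], event_observed=died[mask], label=label)
    print(label, "median survival:", kmf.median_survival_time_)

# Log-rank test between the two metastasis-free-interval groups.
result = logrank_test(months[late_met], months[~late_met],
                      event_observed_A=died[late_met],
                      event_observed_B=died[~late_met])
print("log-rank p =", result.p_value)
```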
Abstract:
Background. The association between a prior history of atopy or other autoimmune diseases and the risk of alopecia areata is not well established.

Objective. The purpose of this study was to use the National Alopecia Areata Registry database to further investigate the association between a history of atopy or other autoimmune diseases and the risk of alopecia areata.

Methods. A total of 2,613 self-registered sporadic cases (n = 2,055) and controls (n = 558) were included in the present analysis.

Results. A history of any atopy (OR = 2.00; 95% CI 1.50-2.54) or autoimmune disease (OR = 1.73; 95% CI 1.10-2.72) was associated with an increased risk of alopecia areata. There was no trend between a history of more than one atopy or autoimmune disease and increasing risk of alopecia areata.

Limitations. Recall, reporting, and recruiting bias are potential limitations of this analysis.

Conclusion. This analysis revealed that a prior history of atopy and autoimmune disease was associated with an increased risk of alopecia areata, and the results were consistent for both the severe subtype of alopecia areata (i.e., alopecia totalis and alopecia universalis) and the localized subtype (i.e., alopecia areata persistent).
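For reference, odds ratios like those above can be computed directly from a 2×2 case-control table; a minimal sketch with made-up counts (not the registry's actual exposure breakdown), using Woolf's logit method for the confidence interval:

```python
# Illustrative only: made-up 2x2 counts, not the registry's data.
import math

# Rows: history of atopy present/absent; columns: cases/controls.
a, b = 600, 100    # exposed cases, exposed controls (hypothetical)
c, d = 1455, 458   # unexposed cases, unexposed controls (hypothetical)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's method
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lower:.2f}-{upper:.2f}")
```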
Abstract:
Context: Despite tremendous strides in HIV treatment over the past decade, resistance remains a major problem. A growing number of patients develop resistance and require new therapies to suppress viral replication.

Objective: To assess the safety of multiple administrations of the anti-CD4 receptor monoclonal antibody ibalizumab, given as intravenous (IV) infusions in three dosage regimens, in subjects infected with human immunodeficiency virus type 1 (HIV-1).

Design: Phase 1, multi-center, open-label, randomized clinical trial comparing the safety, pharmacokinetics, and antiviral activity of three dosages of ibalizumab.

Setting: Six clinical trial sites in the United States.

Participants: A total of twenty-two HIV-positive patients on no antiretroviral therapy or on a stable failing regimen.

Intervention: Patients were randomized to one of two treatment groups (Arms A and B), followed by non-randomized enrollment in Arm C. Patients randomized to Arm A received 10 mg/kg of ibalizumab every 7 days, for a total of 10 doses. Patients randomized to Arm B received a total of six doses of ibalizumab: a single loading dose of 10 mg/kg on Day 1, followed by five maintenance doses of 6 mg/kg every 14 days starting at Week 1. Patients assigned to Arm C received 25 mg/kg of ibalizumab every 14 days for a total of 5 doses. All patients were followed for safety for an additional 7 to 8 weeks.

Main Outcome Measures: Clinical and laboratory assessments of the safety and tolerability of multiple administrations of ibalizumab in HIV-infected patients. Secondary measures of efficacy included HIV-1 RNA (viral load) measurements.

Results: Twenty-one patients were treatment-experienced and 1 was naïve to HIV therapy. Six patients were failing despite therapy and 15 were on no current HIV treatment. Mean baseline viral load (4.78 log10; range 3.7-5.9) and CD4+ cell counts (332/μL; range 89-494) were similar across cohorts. Mean peak decreases in viral load from baseline of 0.99 log10, 1.11 log10, and 0.96 log10 occurred by Week 2 in Cohorts A, B, and C, respectively. Viral loads decreased by >1.0 log10 in 64% of patients; in 4 patients, viral loads were suppressed to <400 copies/mL. Viral loads returned toward baseline by Week 9, with reduced susceptibility to ibalizumab. CD4+ cell counts rose transiently and returned toward baseline. Maximum median elevations above baseline in CD4+ cell counts for Cohorts A, B, and C were +257, +198, and +103 cells/μL, respectively, and occurred within 3 weeks in 16 of 22 subjects. The half-life of ibalizumab was 3-3.5 days, and elimination was characteristic of capacity-limited kinetics. Administration of ibalizumab was well tolerated. Four serious adverse events were reported during the study; none were related to study drug. Headache, nausea, and cough were the most frequently reported treatment-emergent adverse events, and there were no laboratory abnormalities related to study drug.

Conclusions: Ibalizumab administered either weekly or bi-weekly was safe and well tolerated, and demonstrated antiviral activity. Further studies of ibalizumab in combination with standard antiretroviral treatments are warranted.
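As a quick aid for reading the log10 figures above, a minimal sketch converting the abstract's reported log10 viral loads into copy counts; this is plain arithmetic on the published means, nothing more:

```python
# Arithmetic aid: converting log10 viral loads to copies/mL.
baseline_log10 = 4.78  # mean baseline viral load from the abstract
print(f"baseline ~ {10 ** baseline_log10:,.0f} copies/mL")

# Mean peak decreases for Cohorts A, B, and C.
for cohort, drop in [("A", 0.99), ("B", 1.11), ("C", 0.96)]:
    remaining = 10 ** (baseline_log10 - drop)
    print(f"Cohort {cohort}: {drop} log10 drop -> "
          f"~{remaining:,.0f} copies/mL ({10 ** drop:.1f}-fold reduction)")
```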
Abstract:
Hypertension in adults is defined by the risk for cardiovascular morbidity and mortality, but in children, hypertension is defined using population norms. The diagnosis of hypertension in children and adolescents requires only casual blood pressure measurements, but the use of ambulatory blood pressure monitoring to further evaluate patients with elevated blood pressure has been recommended in the Fourth Report on the Diagnosis, Evaluation, and Treatment of High Blood Pressure in Children and Adolescents. The aim of this study was to assess the association between stage of hypertension (using both casual and 24-hour ambulatory blood pressure measurements) and target organ damage, defined by left ventricular hypertrophy (LVH), in a sample of children and adolescents in Houston, TX.

A retrospective analysis was performed on the primary de-identified data from the combined participants of two IRB-approved cross-sectional studies. The studies collected basic demographic data, height, weight, casual blood pressures, ambulatory blood pressures, and left ventricular measurements by echocardiography on children aged 8 to 18 years. Hypertension was defined and staged using the criteria for ambulatory blood pressure reported by Lurbe et al. [1], with some modification. Left ventricular hypertrophy was defined using left ventricular mass index (LVMI) criteria specific for children and adults: the pediatric criterion was LVMI (left ventricular mass indexed to height^2.7) above the 95th percentile for gender, and the adult criterion was LVMI greater than 51 g/m^2.7. Participants from the original studies were included in this analysis if they had complete demographic information, anthropometric measures, casual blood pressures, ambulatory blood pressures, and echocardiography data.

There were 241 children and adolescents included: 19.1% were normotensive, 17.0% had white coat hypertension, 11.6% had masked hypertension, and 52.4% had confirmed hypertension. Of those with hypertension, 22.4% had stage 1 hypertension, 5.8% had stage 2 hypertension, and 24.1% had stage 3 hypertension. Participants with confirmed hypertension were more likely to have LVH by the pediatric criterion than those who were normotensive [OR 2.19, 95% CI (1.04-4.63)]; LVH defined by the adult criterion did not differ significantly between normotensives and hypertensives [OR 2.08, 95% CI (0.58-7.52)]. However, there was a significant trend of increasing prevalence of LVH across the six blood pressure categories for LVH defined by both pediatric and adult criteria (p < 0.001 and p = 0.02, respectively). Additionally, the mean LVM indexed by height^2.7 showed a significantly increasing trend across blood pressure stages from normal to stage 3 hypertension (p < 0.02).

Pediatric hypertension is defined using population norms, and although children with mild hypertension are not at increased odds of having target organ damage defined by LVH, those with severe hypertension are more likely to have LVH. Staging hypertension by ambulatory blood pressure further describes an individual's risk for LVH target organ damage.
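A minimal sketch of the LVH classification rule described above; the 51 g/m^2.7 adult cutoff comes from the abstract, while the pediatric 95th-percentile value is a stand-in, since the gender-specific reference tables are not reproduced here:

```python
# Illustrative only: the pediatric percentile cutoff is a placeholder.
def lvmi(lv_mass_g: float, height_m: float) -> float:
    """Left ventricular mass index in g/m^2.7."""
    return lv_mass_g / height_m ** 2.7

ADULT_CUTOFF = 51.0    # g/m^2.7, per the abstract's adult criterion
PEDIATRIC_P95 = 40.0   # hypothetical stand-in for the 95th percentile

lv_mass_g, height_m = 120.0, 1.55   # hypothetical patient
index = lvmi(lv_mass_g, height_m)
print(f"LVMI = {index:.1f} g/m^2.7")
print("LVH by adult criterion:", index > ADULT_CUTOFF)
print("LVH by pediatric criterion:", index > PEDIATRIC_P95)
```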
Abstract:
Background. Cardiac tamponade occurs when fluid, gas, or both accumulate within the pericardium in a large enough amount to compress the heart, causing circulatory compromise. Although previous investigators have found the 12-lead ECG to have poor predictive value in diagnosing cardiac tamponade, very few studies have evaluated it as a follow-up tool for ruling in or ruling out tamponade in patients with previously diagnosed malignant pericardial effusions.

Methods. 127 patients with malignant pericardial effusions at the MD Anderson Cancer Center were included in this retrospective study. While 83 of these patients had cardiac tamponade diagnosed by echocardiographic criteria (the gold standard), 44 did not. We computed the sensitivity (Se), specificity (Sp), and positive (PPV) and negative predictive values (NPV) for individual ECG abnormalities and combinations thereof. Individual ECG abnormalities were also entered singly into a univariate logistic regression model to predict tamponade.

Results. For patients with effusions of all sizes, electrical alternans had a Se, Sp, PPV, and NPV of 22.61%, 97.61%, 95%, and 39.25%, respectively. These parameters for low-voltage complexes were 55.95%, 74.44%, 81.03%, and 46.37%, respectively. The presence of all three ECG abnormalities had a Se = 8.33%, Sp = 100%, PPV = 100%, and NPV = 35.83%, while the presence of at least one of the three ECG abnormalities had a Se = 89.28%, Sp = 46.51%, PPV = 76.53%, and NPV = 68.96%. For patients with effusions of all sizes, electrical alternans had an OR of 12.28 (95% CI 1.58-95.17, p = 0.016), while the presence of at least one ECG abnormality had an OR of 7.25 (95% CI 2.9-18.1, p < 0.001) in predicting tamponade.

Conclusions. Although individual ECG abnormalities had low sensitivities, specificities, NPVs, and PPVs, with the exception of electrical alternans, the presence of at least one of the three ECG abnormalities had a high sensitivity in diagnosing cardiac tamponade. This points to its potential use as a screening test, with a correspondingly high NPV to rule out a diagnosis of tamponade in patients with malignant pericardial effusions. This could save expensive echocardiographic assessments in patients with previously diagnosed pericardial effusions.
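A minimal sketch of the test-characteristic calculations reported above, computed from a 2×2 table of ECG finding versus echocardiographic tamponade; the cell counts below are hypothetical, chosen only to mirror the study's 83/44 case split:

```python
# Illustrative only: hypothetical 2x2 counts (83 tamponade, 44 without).
tp, fp = 19, 2     # ECG finding present: with / without tamponade
fn, tn = 64, 42    # ECG finding absent:  with / without tamponade

se  = tp / (tp + fn)   # sensitivity: finding present given tamponade
sp  = tn / (tn + fp)   # specificity: finding absent given no tamponade
ppv = tp / (tp + fp)   # probability of tamponade given a positive finding
npv = tn / (tn + fn)   # probability of no tamponade given a negative one

for name, value in [("Se", se), ("Sp", sp), ("PPV", ppv), ("NPV", npv)]:
    print(f"{name} = {value:.2%}")
```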
Abstract:
Coronary artery disease (CAD) is the most common cause of morbidity and mortality in the United States. While coronary angiography (CA) is the gold-standard test to investigate coronary artery disease, prospectively gated 64-slice computed tomography (Prosp-64CT) is a new non-invasive technology that uses 64-slice computed tomography with electrocardiographic gating to investigate coronary artery disease. The aim of the current study was to investigate the role of body mass index (BMI) as a factor affecting the occurrence of CA after a Prosp-64CT, as well as the quality of the Prosp-64CT. Demographic and clinical characteristics of the study population were described, and a secondary analysis of data on patients who underwent a Prosp-64CT for evaluation of coronary artery disease was performed. Seventy-seven patients who underwent Prosp-64CT for evaluation of coronary artery disease were identified. Fifteen patients were excluded because they had missing data regarding BMI, quality of the Prosp-64CT, or CA; thus, a total of 62 patients were included in the final analysis. The mean age was 56.2 years and the mean BMI was 31.3 kg/m^2. Eight (13%) patients underwent a CA within one month of Prosp-64CT, and eight (13%) patients had a poor-quality Prosp-64CT. Higher BMI was significantly associated with the occurrence of CA after Prosp-64CT (P<0.05). There was a trend toward an association between obesity and occurrence of CA, but it did not reach statistical significance (P=0.06). Neither BMI nor obesity was significantly associated with poor quality of the Prosp-64CT (P=0.19 and P=0.76, respectively). In conclusion, BMI was significantly associated with the occurrence of CA within one month of Prosp-64CT. Thus, in patients with a higher BMI, diagnostic investigation with both tests could be avoided; rather, only a CA could be performed. However, the relationship of BMI to the quality of Prosp-64CT needs to be further investigated, since the sample size of the current study was small.
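With only eight CA events among 62 patients, small-sample association tests of the kind reported above are often done with Fisher's exact test (the abstract does not name its test, so this is a plausible stand-in); a minimal sketch with hypothetical counts:

```python
# Illustrative only: hypothetical 2x2 counts, not the study's data.
from scipy.stats import fisher_exact

#        CA within 1 month   no CA
table = [[6, 18],    # obese (BMI >= 30), hypothetical split
         [2, 36]]    # non-obese, hypothetical split

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```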
Abstract:
Introduction: HIV-associated malignancies such as Kaposi's sarcoma and non-Hodgkin's lymphoma occur in children and usually lead to significant morbidity and mortality. No studies have been done to establish the prevalence and outcome of these malignancies in children in a hospital setting in Uganda.

Research question: What proportion of children attending the Baylor-Uganda COE present with HIV-associated malignancies, and what are the characteristics and outcomes of these malignancies? The objective was to determine the prevalence, associated factors, and outcomes of HIV-associated malignancies among children attending the Baylor-Uganda Clinic in Kampala, Uganda.

Study Design: This was a retrospective case series involving a records review of patients who presented to the Baylor Clinic between January 2004 and December 2008.

Study Setting: The Baylor-Uganda Clinic, where I worked as a physician before coming to Houston, is a well-funded, well-staffed pediatric HIV clinic located in Mulago Hospital, Kampala, Uganda, and is affiliated with Makerere University Medical School.

Study Participants: Medical charts of patients aged 6 weeks to 18 years who enrolled for care at the clinic during the years 2004 to 2008 were retrieved for data abstraction.

Selection Criteria: Study participants had to be patients of Baylor-Uganda seen during the study period, aged 6 weeks to 18 years, and HIV positive. Patients with incomplete data or whose malignancies were not confirmed by histology were excluded.

Study Variables: Data on each patient's age, sex, diagnosis, type of malignancy, anatomic location of the malignancy, pathology report, baseline laboratory results, and treatment outcome were abstracted.

Data Analysis: Cross-tabulation was done to determine associations between variables using Pearson's chi-square test at the 0.05 significance level. Proportions of malignancies among different groups were determined. In addition, Kaplan-Meier survival analysis and comparison of survival distributions using the log-rank test were done. Change in CD4 percentage from baseline was assessed with the Wilcoxon signed-rank test.

Results: The proportion of children with malignancies during the study period was found to be 1.65%. Only two malignancies were found: Kaposi's sarcoma and non-Hodgkin's lymphoma; 90% of the malignancies were Kaposi's sarcoma. Lymph node involvement in children with Kaposi's sarcoma was common, but the worst prognosis was seen with visceral involvement. Deaths during follow-up occurred in the first few weeks to months. Upon starting treatment, the CD4 cell percentage increased significantly from a baseline median of 6% to 14% at 6 months and 15.8% at 12 months of follow-up.
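A minimal sketch of the paired CD4-percentage comparison described above, using SciPy's Wilcoxon signed-rank test; the baseline and 6-month pairs below are made up and only echo the direction of change reported in the abstract:

```python
# Illustrative only: made-up paired CD4 percentages.
from scipy.stats import wilcoxon

baseline_cd4 = [5, 6, 4, 8, 7, 6, 5, 9, 6, 7]            # % at baseline
month6_cd4   = [12, 15, 10, 16, 13, 14, 11, 18, 12, 15]  # % at 6 months

stat, p_value = wilcoxon(baseline_cd4, month6_cd4)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.4f}")
```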
Abstract:
The purpose of this study was to determine whether race/ethnicity was a significant risk factor for hospital mortality in children following congenital heart surgery, in a contemporary sample of newborns with congenital heart disease. Unlike previous studies that utilized administrative databases, this study utilized clinical data collected at the point of care to examine racial/ethnic outcome differences in the context of the patients' clinical condition and their overall perioperative experience. A retrospective cohort design was used. The study sample consisted of 316 newborns (<31 days of age) who underwent congenital heart surgery between January 2007 and December 2009. A multivariate logistic regression model was used to determine the impact of race/ethnicity, insurance status, presence of a spatial anomaly, prenatal diagnosis, postoperative sepsis, cardiac arrest, respiratory failure, unplanned reoperation, and total length of stay (LOS) in the intensive care unit (ICU) on outcomes following congenital heart surgery in newborns. The study findings showed that the strongest predictors of hospital mortality following congenital heart surgery in this cohort were postoperative cardiac arrest, postoperative respiratory failure, having a spatial anomaly, and total ICU LOS. Race/ethnicity and insurance status were not significant risk factors.

The institution where this study was conducted is designated as a center of excellence for congenital heart disease. These centers have state-of-the-art facilities, extensive experience in caring for children with congenital heart disease, and superior outcomes. This study suggests that optimal care delivery for newborns requiring congenital heart surgery at a center of excellence portends exceptional outcomes, and that this benefit is conferred upon the entire patient population regardless of race/ethnicity. From a public health and health services perspective, this study also contributes to the overall body of knowledge on racial/ethnic disparities in children with congenital heart defects and puts forward the possibility of a relationship between quality of care and racial/ethnic disparities. Further study is required to examine the impact of race/ethnicity on the long-term outcomes of these children as they encounter the disparate components of the health care delivery system. There is also an opportunity to study the role of race/ethnicity in hospital morbidity in these patients, considering that current expectations for hospital survival are very high and much of the current focus for quality improvement rests on minimizing the development of patient morbidities.
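A minimal sketch of the kind of multivariate logistic regression described above, using statsmodels on synthetic data; the covariate names follow the abstract, but the values are random, so the fitted odds ratios are meaningless:

```python
# Illustrative only: synthetic covariates and outcome, not the cohort data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 316  # matches the cohort size, but the rows are random
df = pd.DataFrame({
    "cardiac_arrest":      rng.integers(0, 2, n),
    "respiratory_failure": rng.integers(0, 2, n),
    "icu_los_days":        rng.exponential(14.0, n),
})
# Synthetic mortality loosely driven by the covariates.
linpred = (-3.0 + 1.5 * df["cardiac_arrest"]
           + 1.2 * df["respiratory_failure"] + 0.02 * df["icu_los_days"])
df["died"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-linpred))).astype(int)

X = sm.add_constant(df[["cardiac_arrest", "respiratory_failure",
                        "icu_los_days"]])
model = sm.Logit(df["died"], X).fit(disp=False)
print(np.exp(model.params))  # odds ratios for each covariate
```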
Abstract:
The study was carried out at St. Luke's Episcopal Hospital to evaluate environmental contamination with Clostridium difficile in the rooms of infected patients. Samples were collected from high-risk areas and were immediately cultured for the presence of Clostridium difficile. The lack of microbial typing prevented molecular characterization of the Clostridium difficile isolates obtained and led to a change in the study hypothesis. The study found a positivity rate of 10% among the 50 hospital rooms sampled for the presence of Clostridium difficile. The study provided data that led to recommendations that routine environmental sampling be carried out in hospital rooms in which patients with Clostridium difficile-associated disease (CDAD) are housed and that effective environmental disinfection methods be used. The study also recommended molecular typing methods to allow characterization of the C. difficile strains isolated from patients and from environmental sampling, to determine their type, similarity, and origin.
Abstract:
Respiratory syncytial virus (RSV) is a major cause of respiratory tract infections in vulnerable patients such as children less than 2 years of age, premature infants with congenital heart disease and chronic lung disease, elderly patients, and patients who have undergone hematopoietic stem cell transplant (HSCT). HSCT patients are at high risk of RSV infection and at increased risk of developing pneumonia and RSV-related mortality. Immunodeficiency can be a major risk factor for severe infection and mortality. Therapy of RSV infection with ribavirin, palivizumab, and immunoglobulin has been shown to reduce the risk of progression to lower respiratory infection (LRI) and mortality, especially if initiated early in the disease. Data on RSV infection in HSCT patients are limited, especially at various levels of immunodeficiency. A total of 323 RSV infections in HSCT patients were identified between January 1995 and August 2009 at The University of Texas MD Anderson Cancer Center (UTMDACC). In this study, we analyzed a de-identified database of these cases and describe the epidemiologic characteristics of RSV infection in HSCT patients, the course of the infection, the rate of development of pneumonia, and RSV-related mortality in HSCT patients at UTMDACC.

Key words: RSV infections, HSCT patients
Abstract:
Undiagnosed infected mothers are often the source of pertussis illness in young infants. The Centers for Disease Control and Prevention (CDC) recommends Tdap vaccine for post-partum women before hospital discharge. This intervention has been implemented at Ben Taub General Hospital (BTGH) in Houston, TX since January 2008. Our objective was to compare the proportion of infants born at BTGH who developed pertussis, relative to the total number of pertussis cases, before and after the intervention.

Methods. We conducted a cross-sectional comparative study between the pre-intervention (7/2000 to 12/2007) and post-intervention (1/2008 to 5/2009) periods. Information on pertussis diagnosis was determined using ICD-9 codes, infection control records, and molecular microbiology reports from Texas Children's Hospital (TCH) and BTGH. Only patients ≤ 6 months of age with laboratory-confirmed B. pertussis infection were included in the study.

Results. A total of 481 infants had pertussis illness: 353 (73.3%) during the pre-intervention and 128 (26.6%) during the post-intervention years. The groups were comparable in all measures, including age (median 73 vs. 62.5 days; p=0.08), gender (males 54.2%; p=0.47), length of hospitalization (median 9.8 vs. 9.5 days; p=0.5), outcomes (2 deaths in each period; p=0.28), and pertussis illness at TCH (95.2% vs. 95.3%; p=0.9). The proportion of pertussis patients born at BTGH, and thus amenable to protection by the intervention, was not statistically different between the two periods after adjusting for age, gender, and ethnicity (7.3% vs. 9.3%; OR=1.05, 95% CI 0.5-2.1, p=0.88).

Conclusions. Vaccinating only mothers with Tdap in the post-partum period does not reduce the proportion of pertussis in infants aged ≤ 6 months. Efforts should be directed at Tdap immunization not only of mothers, but also of all household and key contacts of newborns, to protect them against pertussis illness before the primary DTaP series is completed.
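A minimal sketch of the unadjusted pre/post proportion comparison underlying the result above, using statsmodels; the counts are reconstructed from the abstract's percentages (7.3% of 353 and 9.3% of 128), and the published OR was additionally adjusted for age, gender, and ethnicity, which this simple z-test is not:

```python
# Illustrative only: unadjusted comparison; the study's OR was adjusted.
from statsmodels.stats.proportion import proportions_ztest

born_at_btgh = [26, 12]   # ~7.3% of 353 pre, ~9.3% of 128 post
totals       = [353, 128]

stat, p_value = proportions_ztest(born_at_btgh, totals)
print(f"z = {stat:.2f}, p = {p_value:.2f}")
```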
Abstract:
Identifying accurate numbers of soldiers determined to be medically not ready after completing soldier readiness processing (SRP) may help inform Army leadership about ongoing pressures on a military involved in long conflict with regular deployments. Among Army soldiers screened for deployment using the SRP checklist, what is the prevalence of soldiers determined to be medically not ready?

Study group. 15,289 soldiers screened at all 25 Army deployment platform sites with the eSRP checklist over a 4-month period (June 20, 2009 to October 20, 2009). The data analyzed included age, rank, component, and gender, along with final deployment medical readiness status from the MEDPROS database.

Methods. This information was compiled, and univariate analysis using chi-square tests was conducted for each of the key variables by medical readiness status.

Results. Descriptive epidemiology: of the total sample, 1,548 (9.7%) were female and 14,319 (90.2%) were male. Enlisted soldiers made up 13,543 (88.6%) of the sample and officers 1,746 (11.4%). In the sample, 1,533 (10.0%) were soldiers over the age of 40 and 13,756 (90.0%) were aged 18-40. Reserve, National Guard, and Active Duty made up 1,931 (12.6%), 2,942 (19.2%), and 10,416 (68.1%), respectively.

Univariate analysis: overall, 1,226 (8.0%) of the soldiers screened were determined to be medically not ready for deployment. The strongest predictive factor was female gender (OR 2.8, 95% CI 2.57-3.28, p<0.001), followed by enlisted rank (OR 2.01, 95% CI 1.60-2.53, p<0.001), Reserve component (OR 1.33, 95% CI 1.16-1.53, p<0.001), and National Guard component (OR 0.37, 95% CI 0.30-0.46, p<0.001). Age over 40 demonstrated an OR of 1.2 (95% CI 1.09-1.50, p<0.003). Overall, the results underscore that there may be key demographic groups related to medical readiness that can be targeted with programs and funding to improve overall military medical readiness.
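A minimal sketch of the univariate chi-square analysis described above, for one variable (gender) against readiness status; the 2×2 cell counts are hypothetical splits roughly consistent with the abstract's marginal totals, not the actual MEDPROS data:

```python
# Illustrative only: hypothetical splits of the reported totals.
from scipy.stats import chi2_contingency

#        not ready   ready
table = [[265, 1283],    # female (hypothetical split)
         [961, 13358]]   # male   (hypothetical split)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.3g}")

# Crude odds ratio for not-ready status, female vs. male.
(a, b), (c, d) = table
print(f"OR = {(a * d) / (b * c):.2f}")
```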
Abstract:
Purpose. To evaluate the use of the Legionella urinary antigen test as a cost-effective method for diagnosing Legionnaires' disease in five San Antonio hospitals from January 2007 to December 2009.

Methods. The data reported by five San Antonio hospitals to the San Antonio Metropolitan Health District during a 3-year retrospective study period (January 2007 to December 2009) were evaluated for the frequency of non-specific pneumonia infections, the number of Legionella urinary antigen tests performed, and the percentage of positive cases of Legionnaires' disease diagnosed by the test.

Results. There were a total of 7,087 cases of non-specific pneumonia reported across the five San Antonio hospitals studied from 2007 to 2009. A total of 5,371 Legionella urinary antigen tests were performed from January 2007 to December 2009 across the five hospitals in the study. A total of 38 positive cases of Legionnaires' disease (a positivity rate of about 0.7%) were identified by the Legionella urinary antigen test from 2007 to 2009.

Conclusions. In spite of the limitations of this study in obtaining sufficient relevant data to evaluate the cost-effectiveness of the Legionella urinary antigen test in diagnosing Legionnaires' disease, the test is simple, accurate, and fast, as results can be obtained within minutes to hours, and convenient, because it can be performed in the emergency department on any patient who presents with clinical signs or symptoms of pneumonia. Over the long run, it remains to be shown whether this test may decrease mortality, lower total medical costs by decreasing the number of broad-spectrum antibiotics prescribed, shorten patient wait times and hospital stays, decrease the need for unnecessary ancillary testing, and improve overall patient outcomes.
Abstract:
Racial differences in heart failure with preserved ejection fraction (HFpEF) have rarely been studied in an ambulatory, financially "equal access" cohort, although the majority of such patients are treated as outpatients.

Retrospective data were collected from 2,526 patients (2,240 White, 286 African American) with HFpEF treated at 153 VA clinics as part of the VA External Peer Review Program (EPRP) between October 2000 and September 2002. Kaplan-Meier curves (stratified by race) were created for time to first heart failure (HF) hospitalization, all-cause hospitalization, and death, and multivariate Cox proportional hazards regression models were constructed to evaluate the effect of race on these outcomes.

African American patients were younger (67.7 ± 11.3 vs. 71.2 ± 9.8 years; p < 0.001) and had a lower prevalence of atrial fibrillation (24.5% vs. 37%; p < 0.001) and chronic obstructive pulmonary disease (23.4% vs. 36.9%; p < 0.001), but had higher blood pressure (systolic blood pressure > 120 mm Hg in 77.6% vs. 67.8%; p < 0.01), a higher glomerular filtration rate (67.9 ± 31.0 vs. 61.6 ± 22.6 mL/min/1.73 m^2; p < 0.001), and more anemia (56.6% vs. 41.7%; p < 0.001) compared with White patients. African Americans were found to have a higher risk-adjusted rate of HF hospitalization (HR 1.52, 95% CI 1.1-2.11; p = 0.01), with no difference in risk-adjusted all-cause hospitalization (p = 0.80) or death (p = 0.21).

In the financially "equal access" setting of the VA, among ambulatory patients with HFpEF, African Americans have similar rates of mortality and all-cause hospitalization but an increased risk of HF hospitalization compared with Whites.
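A minimal sketch of the risk-adjusted Cox proportional hazards model described above, using the lifelines library on synthetic data; the covariate names echo the abstract, but the rows are random, so the fitted hazard ratios carry no meaning:

```python
# Illustrative only: synthetic cohort, not the EPRP data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 2526  # matches the cohort size, but the values are random
df = pd.DataFrame({
    "african_american":  rng.integers(0, 2, n),
    "age":               rng.normal(70.0, 10.0, n),
    "atrial_fib":        rng.integers(0, 2, n),
    "months_to_hf_hosp": rng.exponential(24.0, n),
    "hf_hosp":           rng.integers(0, 2, n),  # 1 = event observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_hf_hosp", event_col="hf_hosp")
print(cph.summary[["exp(coef)", "p"]])  # hazard ratio and p per covariate
```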
Abstract:
This study was designed to investigate the effect of calcium and fluoride intake, and of parity and lactation, on the risk of spinal osteoporosis. Height loss was used as a surrogate measure for spinal fractures, taking advantage of documented changes in height found during the 25-year follow-up of the Charleston Heart Study cohort. Women who had lost 2-4 inches in height or who had no change in height during the follow-up period were defined as case and comparison subjects, respectively. Calcium intake when the subjects were "about 25" and in the recent past, average intake of fluoride over 25 years, and parity and history of breastfeeding were ascertained by questionnaire from 54 case and 77 comparison subjects.

Low calcium intake in the past decreased the risk of height loss (age-adjusted OR = 0.3, 95% CI 0.1-0.96), although several potentially important confounding variables could not be adjusted for. There was no association between the risk of height loss and present calcium intake (OR = 0.8, 95% CI 0.3-2.6 for low versus high intake) after adjustment for past calcium intake. High fluoride intake decreased the risk of height loss (adjusted OR = 0.4, 95% CI 0.1-1.2). The effect of present fluoride or calcium intake was modified by the level of the other nutrient: compared with a low intake of both calcium and fluoride, a high intake of one increased the risk of height loss (crude OR = 3.3 for high fluoride/low calcium, crude OR = 6.0 for high calcium/low fluoride), although a high intake of both was slightly protective (crude OR = 0.7). It is estimated that a "high" nutrient intake in this population was greater than 850 mg/day for calcium and 2 mg/day for fluoride.

After adjustment for age, increasing parity decreased the risk of height loss in women who had never breastfed (OR = 0.2, 95% CI 0.01-1.7 for 4 or more children). Women who had breastfed were also at lower risk of height loss than nulliparous women (OR = 0.3, 95% CI 0.1-1.2 for 4 or more children), although at any level of parity, breastfeeding women had a greater risk of height loss than non-breastfeeding women.