136 results for predictors
Abstract:
Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach to discontinuing continuous renal replacement therapy may affect patient outcomes; however, there is a lack of information on how and under what conditions continuous renal replacement therapy is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the "success" group, and the rest (216 patients) were classified as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) than patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per μmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine. The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of initial cessation of continuous renal replacement therapy was the most important predictor of successful discontinuation, especially if occurring without the administration of diuretics. (Crit Care Med 2009; 37:2576-2582)
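Editorial note: the analysis above pairs a multivariable logistic regression (odds ratios per unit increase in urine output and creatinine) with per-predictor ROC curves. The sketch below, on purely synthetic data with hypothetical variable names and effect sizes, shows how such an analysis is commonly set up in Python; it is not the authors' code and will not reproduce the reported numbers.

```python
# Illustrative sketch on synthetic data; not the study's dataset or analysis code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 529  # patients surviving initial CRRT, per the abstract

# Hypothetical predictors: 24-h urine output (mL/day) and serum creatinine (umol/L)
urine_output = rng.gamma(shape=2.0, scale=600.0, size=n)
creatinine = rng.normal(loc=250.0, scale=80.0, size=n)

# Hypothetical outcome: higher urine output and lower creatinine favour success
logit = -1.0 + 0.0008 * urine_output - 0.004 * creatinine
success = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Multivariable logistic regression, analogous in spirit to the reported model
X = np.column_stack([urine_output, creatinine])
model = LogisticRegression(max_iter=1000).fit(X, success)
or_per_100ml = np.exp(model.coef_[0, 0] * 100)  # odds ratio per 100 mL/day increase
or_per_umol = np.exp(model.coef_[0, 1])         # odds ratio per 1 umol/L increase

# Discrimination of each predictor taken alone, as in the abstract's ROC analysis
auc_urine = roc_auc_score(success, urine_output)
auc_creatinine = roc_auc_score(success, -creatinine)  # lower creatinine predicts success

print(or_per_100ml, or_per_umol, auc_urine, auc_creatinine)
```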
Abstract:
DNA-hsp65, a DNA vaccine encoding the 65-kDa heat-shock protein of Mycobacterium leprae (Hsp65), is capable of inducing the reduction of established tumors in mouse models. We conducted a phase I clinical trial of DNA-hsp65 in patients with advanced head and neck carcinoma. In this article, we report on the vaccine's potential to induce immune responses to Hsp65 and to its human homologue, Hsp60, in these patients. Twenty-one patients with unresectable squamous cell carcinoma of the head and neck received three doses of 150, 400, or 600 μg of naked DNA-hsp65 plasmid by ultrasound-guided intratumoral injection. Vaccination did not increase levels of circulating anti-Hsp65 IgG or IgM antibody, or lead to detectable Hsp65-specific cell proliferation or interferon-gamma (IFN-gamma) production by blood mononuclear cells. The frequency of antigen-induced IL-10-producing cells increased after vaccination in 4 of the 13 patients analyzed. Five patients showed disease stability or regression following immunization; however, we were unable to detect significant differences between these patients and those with disease progression using these parameters. There was also no increase in antibody or IFN-gamma responses to human Hsp60 in these patients. Our results suggest that although DNA-hsp65 induced some degree of immunostimulation with no evidence of pathological autoimmunity, we were unable to differentiate between patients with different clinical outcomes based on the parameters measured. Future studies should focus on characterizing more reliable correlations between immune response parameters and clinical outcome that may be used as predictors of vaccine success in immunosuppressed individuals. Cancer Gene Therapy (2009) 16, 598-608; doi:10.1038/cgt.2009.9; published online 6 February 2009
Abstract:
Background: Vascular calcification is common and constitutes a prognostic marker of mortality in the hemodialysis population. Derangements of mineral metabolism may influence its development. The aim of this study is to prospectively evaluate the association between bone remodeling disorders and progression of coronary artery calcification (CAC) in hemodialysis patients. Study Design: Cohort study nested within a randomized controlled trial. Setting & Participants: 64 stable hemodialysis patients. Predictor: Bone-related laboratory parameters and bone histomorphometric characteristics at baseline and after 1 year of follow-up. Outcomes: Progression of CAC assessed by means of coronary multislice tomography at baseline and after 1 year of follow-up. A baseline calcification score of 30 Agatston units or greater was defined as calcification. A change in calcification score of 15% or greater was defined as progression. Results: Of 64 patients, 26 (40%) had CAC at baseline and 38 (60%) did not. Participants without CAC at baseline were younger (P < 0.001), mainly men (P = 0.03) and nonwhite (P = 0.003), and had lower serum osteoprotegerin levels (P = 0.003) and higher trabecular bone volume (P = 0.001). Age (P = 0.003; beta coefficient = 1.107; 95% confidence interval [CI], 1.036 to 1.183) and trabecular bone volume (P = 0.006; beta coefficient = 0.828; 95% CI, 0.723 to 0.948) were predictors for CAC development. Of 38 participants who had calcification at baseline, 26 (68%) had CAC progression in 1 year. Progressors had lower bone-specific alkaline phosphatase (P = 0.03) and deoxypyridinoline levels (P = 0.02) on follow-up, and low turnover was mainly diagnosed at the 12-month bone biopsy (P = 0.04). Low-turnover bone status at the 12-month bone biopsy was the only independent predictor for CAC progression (P = 0.04; beta coefficient = 4.5; 95% CI, 1.04 to 19.39). According to bone histological examination, nonprogressors with initially high turnover (n = 5) subsequently had a decreased bone formation rate (P = 0.03), and those initially with low turnover (n = 7) subsequently had an increased bone formation rate (P = 0.003) and osteoid volume (P = 0.001). Limitations: Relatively small population, absence of patients with severe hyperparathyroidism, short observational period. Conclusions: Lower trabecular bone volume was associated with CAC development, whereas improvement in bone turnover was associated with lower CAC progression in patients with high- and low-turnover bone disorders. Because CAC is implicated in cardiovascular mortality, bone derangements may constitute a modifiable mortality risk factor in hemodialysis patients.
Abstract:
Although a new protocol of dobutamine stress echocardiography with the early injection of atropine (EA-DSE) has been demonstrated to be useful in reducing adverse effects and increasing the number of effective tests, and to have accuracy similar to conventional protocols for detecting coronary artery disease (CAD), no data exist regarding its ability to predict long-term events. The aim of this study was to determine the prognostic value of EA-DSE and the effects of the long-term use of beta blockers on it. A retrospective evaluation of 844 patients who underwent EA-DSE for known or suspected CAD was performed; 309 (37%) were receiving beta blockers. During a median follow-up period of 24 months, 102 events (12%) occurred. On univariate analysis, predictors of events were the ejection fraction (p <0.001), male gender (p <0.001), previous myocardial infarction (p <0.001), angiotensin-converting enzyme inhibitor therapy (p = 0.021), calcium channel blocker therapy (p = 0.034), and abnormal results on EA-DSE (p <0.001). On multivariate analysis, the independent predictors of events were male gender (relative risk [RR] 1.78, 95% confidence interval [CI] 1.13 to 2.81, p = 0.013) and abnormal results on EA-DSE (RR 4.45, 95% CI 2.84 to 7.01, p <0.0001). Normal results on EA-DSE with beta blockers were associated with a nonsignificantly higher incidence of events than normal results on EA-DSE without beta blockers (RR 1.29, 95% CI 0.58 to 2.87, p = 0.54). Abnormal results on EA-DSE with beta blockers had an RR of 4.97 (95% CI 2.79 to 8.87, p <0.001) compared with normal results, while abnormal results on EA-DSE without beta blockers had an RR of 5.96 (95% CI 3.41 to 10.44, p <0.001) for events, with no difference between groups (p = 0.36). In conclusion, the detection of fixed or inducible wall motion abnormalities during EA-DSE was an independent predictor of long-term events in patients with known or suspected CAD. The prognostic value of EA-DSE was not affected by the long-term use of beta blockers. (C) 2008 Elsevier Inc. All rights reserved. (Am J Cardiol 2008;102:1291-1295)
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring, without routine virological monitoring, for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value, in the presence of immunological and clinical data, for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age < 15 years, and continuous HAART for >= 6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, a most recent viral load >5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte value, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for identification of children receiving HAART who are at risk for significant HIV-related illness. To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
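Editorial note: the study above fits a Cox proportional hazards model for time to a new WHO stage 3 or 4 event. The sketch below illustrates a minimal Cox fit on synthetic data using the lifelines library; covariate names, effect sizes, and follow-up times are hypothetical, and the time-varying handling of repeated laboratory tests described in the abstract is omitted for brevity.

```python
# Illustrative sketch on synthetic data; covariates and effects are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 584  # number of children reported in the abstract

df = pd.DataFrame({
    "high_viral_load": rng.binomial(1, 0.3, n),  # most recent VL above threshold
    "low_cd4": rng.binomial(1, 0.25, n),
    "hemoglobin": rng.normal(11.5, 1.5, n),
    "age_years": rng.uniform(1, 15, n),
})

# Hypothetical time to a new WHO stage 3/4 event, with administrative censoring
# at 2.5 years (the mean follow-up reported in the abstract)
hazard = 0.05 * np.exp(0.6 * df["high_viral_load"] + 0.4 * df["low_cd4"])
event_time = rng.exponential(1.0 / hazard)
df["event"] = (event_time <= 2.5).astype(int)
df["time"] = np.minimum(event_time, 2.5)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary)  # adjusted hazard ratios with 95% confidence intervals
```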
Abstract:
Aims: There remains significant concern about the long-term safety of drug-eluting stents (DES). However, bare metal stents (BMS) have been used safely for over two decades. There is therefore a pressing need to explore alternative strategies for reducing restenosis with BMS. This study was designed to examine whether IVUS-guided cutting balloon angioplasty (CBA) with BMS could achieve restenosis rates similar to those of DES. Methods and results: In the randomised REstenosis reDUction by Cutting balloon angioplasty Evaluation (REDUCE III) study, 521 patients were divided into four groups based on device and IVUS use before BMS (IVUS-CBA-BMS: 137 patients; Angio-CBA-BMS: 123; IVUS-BA-BMS: 142; and Angio-BA-BMS: 119). At follow-up, the IVUS-CBA-BMS group had a significantly lower restenosis rate (6.6%) than the other groups (p=0.016). We performed a quantitative coronary angiography (QCA)-based matched comparison between an IVUS-guided CBA-BMS strategy (REDUCE III) and a DES strategy (Rapamycin-Eluting-Stent Evaluation At Rotterdam Cardiology Hospital, the RESEARCH study). We matched the presence of diabetes, vessel size, and lesion severity by QCA. Restenosis (>50% diameter stenosis at follow-up) and target vessel revascularisation (TVR) were examined. The QCA-matched comparison resulted in 120 paired lesions. While acute gain was significantly greater with IVUS-CBA-BMS than with DES (1.65 +/- 0.41 mm vs. 1.28 +/- 0.57 mm, p=0.001), late loss was significantly less with DES than with IVUS-CBA-BMS (0.03 +/- 0.42 mm vs. 0.80 +/- 0.47 mm, p=0.001). However, no difference was found in restenosis rates (IVUS-CBA-BMS: 6.6% vs. DES: 5.0%, p=0.582) or TVR (6.6% and 6.6%, respectively). Conclusions: An IVUS-guided CBA-BMS strategy yielded restenosis rates similar to those achieved by DES and provided an effective alternative to the use of DES.
Abstract:
Objective: To determine the relationship between age and in-hospital mortality of elderly patients admitted to the ICU, requiring and not requiring invasive ventilatory support. Design: Prospective observational cohort study conducted over a period of 11 months. Setting: Medical-surgical ICU at a Brazilian university hospital. Subjects: A total of 840 patients aged 55 years and older admitted to the ICU. Methods: In-hospital death rates for patients requiring and not requiring invasive ventilatory support were compared across three successive age intervals (55-64, 65-74, and 75 or more years), adjusting for severity of illness using the Acute Physiologic Score. Results: Age was strongly correlated with mortality among the invasively ventilated subgroup of patients, and the multivariate adjusted odds ratios increased progressively with each age increment (OR = 1.60, 95% CI = 1.01-2.54 for 65-74 years and OR = 2.68, 95% CI = 1.58-4.56 for >= 75 years). For patients not receiving invasive ventilatory support, age was not independently associated with in-hospital mortality (OR = 2.28, 95% CI = 0.99-5.25 for 65-74 years and OR = 1.95, 95% CI = 0.82-4.62 for >= 75 years). Conclusions: The combination of age and invasive mechanical ventilation is strongly associated with in-hospital mortality. Age should not be considered a factor related to in-hospital mortality of elderly patients not requiring invasive ventilatory support in the ICU.
Abstract:
An abnormal heart-rate (HR) response during or after a graded exercise test has been recognized as a strong and independent predictor of all-cause mortality in healthy and diseased subjects. The purpose of the present study was to evaluate the HR response during exercise in women with systemic lupus erythematosus (SLE). In this case-control study, 22 women with SLE (age 29.5 +/- 1.1 years) were compared with 20 gender-, BMI-, and age-matched healthy subjects (age 26.5 +/- 1.4 years). A treadmill cardiorespiratory test was performed, and the HR response during exercise was evaluated by the chronotropic reserve (CR). HR recovery (Delta HRR) was defined as the difference between HR at peak exercise and at both the first (Delta HRR1) and second (Delta HRR2) minutes after exercising. SLE patients presented lower peak VO2 than healthy subjects (27.6 +/- 0.9 vs. 36.7 +/- 1.1 ml/kg/min, p = 0.001, respectively). Additionally, SLE patients demonstrated lower CR (71.8 +/- 2.4 vs. 98.2 +/- 2.6%, p = 0.001), Delta HRR1 (22.1 +/- 2.5 vs. 32.4 +/- 2.2%, p = 0.004), and Delta HRR2 (39.1 +/- 2.9 vs. 50.8 +/- 2.5%, p = 0.001) than their healthy peers. In conclusion, SLE patients presented an abnormal HR response to exercise, characterized by chronotropic incompetence and delayed Delta HRR. Lupus (2011) 20, 717-720.
Abstract:
Background. The incidence of unexplained sudden death (SD) and the factors involved in its occurrence in patients with chronic kidney disease are not well known. Methods. We investigated the incidence of, and the role of co-morbidities in, unexplained SD in 1139 haemodialysis patients on the renal transplant waiting list. Results. Forty-four patients died from SD of undetermined causes (20% of all deaths; 3.9 deaths/1000 patients per year), while 178 died from other causes and 917 survived. SD patients were older and more likely to have diabetes, hypertension, past/present cardiovascular disease, a higher left ventricular mass index, and a lower ejection fraction. Multivariate analysis showed that cardiovascular disease of any type was the only independent predictor of SD (P = 0.0001, HR = 2.13, 95% CI 1.46-3.22). Alterations closely associated with ischaemic heart disease, such as angina, previous myocardial infarction, and an altered myocardial scan, were not independent predictors of SD. The incidence of unexplained SD in these haemodialysis patients is high and probably a consequence of pre-existing cardiovascular disease. Conclusions. Factors influencing SD in dialysis patients are not substantially different from those in the general population. The role played by ischaemic heart disease in this context needs further evaluation.
Abstract:
Objective: The objective of the study was to investigate whether depression is a predictor of postdischarge smoking relapse among patients hospitalized for myocardial infarction (MI) or unstable angina (UA) in a smoke-free hospital. Methods: Current smokers with MI or UA were interviewed while hospitalized; patients classified with major depression (MD) or no mood disorder were reinterviewed 6 months post discharge to ascertain smoking status. Potential predictors of relapse (depression; stress; anxiety; heart disease risk perception; coffee and alcohol consumption; sociodemographic, clinical, and smoking habit characteristics) were compared between those with MD (n = 268) and no mood disorder (n = 135). Results: Relapsers (40.4%) were more frequently and more severely depressed, had higher anxiety and lower self-efficacy scale scores, a diagnosis of UA, shorter hospitalizations, started smoking younger, made fewer attempts to quit, had a consort less often, and were more frequently at the 'precontemplation' stage of change. Multivariate analysis showed the positive predictors of relapse to be MD [odds ratio (OR): 2.549; 95% confidence interval (CI): 1.519-4.275] (P<0.001); 'precontemplation' stage of change (OR: 7.798; 95% CI: 2.442-24.898) (P<0.001); previous coronary bypass graft surgery (OR: 4.062; 95% CI: 1.356-12.169) (P=0.012); and previous anxiolytic use (OR: 2.365; 95% CI: 1.095-5.107) (P=0.028). Negative predictors were diagnosis of MI (OR: 0.575; 95% CI: 0.361-0.916) (P=0.019); duration of hospitalization (OR: 0.935; 95% CI: 0.898-0.973) (P=0.001); smoking onset age (OR: 0.952; 95% CI: 0.910-0.994) (P=0.028); number of attempts to quit smoking (OR: 0.808; 95% CI: 0.678-0.964) (P=0.018); and 'action' stage of change (OR: 0.065; 95% CI: 0.008-0.532) (P=0.010). Conclusion: Depression, lack of motivation, shorter hospitalization, and severity of illness contributed to postdischarge resumption of smoking by patients with acute coronary syndrome who underwent hospital-initiated smoking cessation.
Abstract:
Objectives: To evaluate clinical and echocardiographic variables that could be used to predict outcomes in patients with asymptomatic severe aortic valve stenosis. Management of asymptomatic severe aortic stenosis is controversial. Because prophylactic surgery may be protective, independent predictors of events that could justify early surgery have been sought. Methods: Outpatients (n = 133; mean [+/- SD] age, 66.2 +/- 13.6 years) with isolated severe asymptomatic aortic stenosis but normal left ventricular function and no previous myocardial infarction were followed up prospectively at a tertiary care hospital. Interventions: We used a "wait-for-events" strategy. Clinical and echocardiographic variables were analyzed. Results: Nineteen patients developed angina; 40, dyspnea; 5, syncope; and 7, sudden death during a mean follow-up period of 3.30 +/- 1.87 years. Event-free survival was 90.2 +/- 2.6% at 1 year, 73.4 +/- .9% at 2 years, 70.7 +/- 4.3% at 3 years, 57.8 +/- 4.7% at 4 years, 40.3 +/- 5.0% at 5 years, and 33.3 +/- 5.2% at 6 years. The mean follow-up period until sudden death (1.32 +/- 1.11 years) was shorter than that for dyspnea (2.44 +/- 1.84 years), syncope (2.87 +/- 1.26 years), and angina (3.03 +/- 1.68 years). Cox regression analysis disclosed only a reduced, though still within normal limits, ejection fraction as an independent predictor of total events. Conclusions: Management based on a "wait-for-events" strategy is generally safe. Progressive left ventricular ejection fraction reduction, even within normal limits, identified patients at high risk for events in whom valve replacement surgery should be considered. (c) 2007 Elsevier Ireland Ltd. All rights reserved.
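Editorial note: the yearly event-free survival percentages above are the kind of quantity usually read off a Kaplan-Meier curve. The sketch below estimates event-free survival on synthetic follow-up data with the lifelines library; the durations and censoring scheme are hypothetical and do not reproduce the study's figures.

```python
# Illustrative sketch on synthetic follow-up data; not the study's data.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
n = 133  # cohort size reported in the abstract

# Hypothetical time to first event (years), censored administratively at 6 years
event_time = rng.exponential(scale=5.0, size=n)
observed = event_time <= 6.0
durations = np.minimum(event_time, 6.0)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
# Estimated event-free survival at yearly time points, the quantity tabulated
# in the abstract
print(kmf.survival_function_at_times([1, 2, 3, 4, 5, 6]))
```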
Abstract:
Background: We validated a strategy for the diagnosis of coronary artery disease (CAD) and prediction of cardiac events in high-risk renal transplant candidates (at least one of the following: age >= 50 years, diabetes, cardiovascular disease). Methods: A diagnosis and risk assessment strategy was used in 228 renal transplant candidates to validate an algorithm. Patients underwent dipyridamole myocardial stress testing and coronary angiography and were followed up until death, renal transplantation, or cardiac events. Results: The prevalence of CAD was 47%. Stress testing did not detect significant CAD in one-third of patients. The sensitivity, specificity, and positive and negative predictive values of the stress test for detecting CAD were 70, 74, 69, and 71%, respectively. CAD, defined by angiography, was associated with an increased probability of cardiac events [log-rank P = 0.001; hazard ratio: 1.90, 95% confidence interval (CI): 1.29-2.92]. Diabetes (P = 0.03; hazard ratio: 1.58, 95% CI: 1.06-2.45) and angiographically defined CAD (P = 0.03; hazard ratio: 1.69, 95% CI: 1.08-2.78) were the independent predictors of events. Conclusion: The results validate our observations in a smaller number of high-risk transplant candidates and indicate that stress testing is not appropriate for the diagnosis of CAD or prediction of cardiac events in this group of patients. Coronary angiography was correlated with events but, because less than 50% of patients had significant disease, it seems premature to recommend the test to all high-risk renal transplant candidates. The results suggest that angiography is necessary in many high-risk renal transplant candidates and that better noninvasive methods are still lacking to identify with precision the patients who will benefit from invasive procedures. Coron Artery Dis 21: 164-167 (C) 2010 Wolters Kluwer Health | Lippincott Williams & Wilkins.
Abstract:
Left ventricular hypertrophy is an important predictor of cardiovascular risk and sudden death. This study explored the ability of four obesity indexes (body mass index, waist circumference, waist-hip ratio, and waist-stature ratio) to identify left ventricular hypertrophy. A sample of the general population (n = 682; 43.5% men) was surveyed to assess cardiovascular risk factors. Biochemical, anthropometric, and blood pressure values were obtained in a clinic visit according to standard methods. Left ventricular mass was obtained from a transthoracic echocardiogram. Left ventricular hypertrophy was defined using population-specific cutoff values for left ventricular mass indexed to height^2.7. The waist-stature ratio showed the strongest positive association with left ventricular mass. This correlation was stronger in women, even after controlling for age and systolic blood pressure. By multivariate analysis, the main predictors of left ventricular hypertrophy were waist-stature ratio (23%), systolic blood pressure (9%), and age (2%) in men, and waist-stature ratio (40%), age (6%), and systolic blood pressure (2%) in women. Receiver-operating characteristic curves showed the optimal cutoff values of the different anthropometric indexes associated with left ventricular hypertrophy. The waist-stature ratio was a significantly better predictor than the other indexes (except for the waist-hip ratio), independent of gender. It is noteworthy that a waist-stature ratio cutoff of 0.56 showed the highest combined sensitivity and specificity to detect left ventricular hypertrophy. Abdominal obesity identified by the waist-stature ratio, rather than overall obesity identified by body mass index, is the simplest and best obesity index for assessing the risk of left ventricular hypertrophy; it is a better predictor in women and has an optimal cutoff of 0.56. Hypertension Research (2010) 33, 83-87; doi: 10.1038/hr.2009.188; published online 13 November 2009
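Editorial note: the 0.56 waist-stature cutoff above is the value maximizing combined sensitivity and specificity on an ROC curve. The sketch below shows the usual way of selecting such a cutoff (Youden's index) on synthetic data; the simulated distributions are hypothetical, and the 0.56 figure comes from the study, not from this code.

```python
# Illustrative sketch on synthetic data; the 0.56 cutoff is the study's result,
# not something this code reproduces.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)
n = 682  # sample size reported in the abstract

# Hypothetical waist-stature ratio (WSR) and left ventricular hypertrophy status
wsr = rng.normal(0.55, 0.07, n)
p_lvh = 1.0 / (1.0 + np.exp(-(wsr - 0.56) * 20.0))
lvh = rng.binomial(1, p_lvh)

# ROC curve for WSR as a predictor of LVH; the "optimal" cutoff is the threshold
# maximizing Youden's index (sensitivity + specificity - 1)
fpr, tpr, thresholds = roc_curve(lvh, wsr)
youden = tpr - fpr
best_cutoff = thresholds[np.argmax(youden)]
print(f"WSR cutoff maximizing combined sensitivity and specificity: {best_cutoff:.2f}")
```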
Abstract:
Background-Peculiar aspects of Chagas cardiomyopathy raise concerns about the efficacy and safety of sympathetic blockade. We studied the influence of beta-blockers in patients with Chagas cardiomyopathy. Methods and Results-We examined the REMADHE trial and grouped patients according to etiology (Chagas versus non-Chagas) and beta-blocker therapy. The primary end point was all-cause mortality or heart transplantation. Altogether 456 patients were studied; 27 (5.9%) underwent heart transplantation and 202 (44.3%) died. Chagas etiology was present in 68 (14.9%) patients; they had a lower body mass index (24.1+/-4.1 versus 26.3+/-5.1, P=0.001), a smaller end-diastolic left ventricle diameter (6.7+/-1.0 cm versus 7.0+/-0.9 cm, P=0.001), a smaller proportion of beta-blocker therapy (35.8% versus 68%, P<0.001), and a higher proportion of spironolactone therapy (74.6% versus 57.8%, P=0.003). Twenty-four (35.8%) patients with Chagas disease were under beta-blocker therapy and had lower serum sodium (136.6+/-3.1 versus 138.4+/-3.1 mEq/L, P=0.05) and lower body mass index (22.5+/-3.3 versus 24.9+/-4.3, P=0.03) compared with those not receiving beta-blockers. Survival was lower in patients with Chagas heart disease than in those with other etiologies. When only patients under beta-blockers were considered, the survival of patients with Chagas disease was similar to that of other etiologies. The survival of patients with beta-blockers was higher than that of patients without beta-blockers. In the Cox regression model, left ventricle end-diastolic diameter (hazard ratio, 1.78; CI, 1.15 to 2.76; P=0.009) and beta-blockers (hazard ratio, 0.37; CI, 0.14 to 0.97; P=0.044) were associated with better survival. Conclusions-Our study suggests that beta-blockers may have beneficial effects on the survival of patients with heart failure and Chagas heart disease and warrants further investigation in a prospective, randomized trial.
Abstract:
Objective: Cardiovascular risk factors were surveyed in two Indian populations (Guarani, n=60; Tupinikin, n=496) and in a non-Indian group (n=114) living in the same reserve on the southeastern Brazilian coast. The relationship between an age-dependent blood pressure (BP) increase and salt consumption was also investigated. Methods: Overnight (12 h) urine was collected to evaluate Na excretion. Fasting glucose and lipids, anthropometry, BP, ECG, and carotid-femoral pulse wave velocity (PWV) were measured in a clinic visit. Participation (318 men/352 women, age 20-94 years; mean=37.6 +/- 14.9 years) comprised 80% of the eligible population. Results: The prevalence of hypertension, diabetes, and high cholesterol was similar in Tupinikins and in non-Indians and higher than in Guaranis. The prevalence of smoking and obesity was higher in the latter group. Hypertension and diabetes were detected in only one individual of the Guarani group. Mean BP adjusted for age and BMI was significantly lower (P<0.01) in Guaranis (82.8 +/- 1.6 mmHg) than in Tupinikins (92.3 +/- 0.5 mmHg) and non-Indians (91.6 +/- 1.1 mmHg). Urinary Na excretion (mEq/12 h), however, was similar in the three groups (Guarani=94 +/- 40; Tupinikin=105 +/- 56; non-Indian=109 +/- 55; P>0.05). PWV (m/s) was lower (P<0.01) in Guaranis (7.5 +/- 1.4) than in Tupinikins (8.8 +/- 2.2) and non-Indians (8.4 +/- 2.0). Multiple regression analysis showed that age and waist-to-hip ratio (WHR) were independent predictors of SBP and DBP (r^2=0.44) in Tupinikins, whereas the WHR was the only independent predictor of BP variability in Guaranis (r^2=0.22). Conclusion: Lower BP levels in Guaranis cannot be explained by the low salt intake observed in other primitive populations. J Hypertens 27:1753-1760 (C) 2009 Wolters Kluwer Health | Lippincott Williams & Wilkins.