866 results for Predictors
Abstract:
Understanding the role of multiple colour signals during sexual signalling is a central theme in animal communication. We quantified the role of multiple colour signals (including ultraviolet, UV), measures of body size and testosterone levels in settling disputes between male rivals in an elaborately ornamented African lizard, played out in a large 'tournament' in the wild. The hue and brightness (total reflectance) of the UV throat in Augrabies flat lizards, Platysaurus broadleyi, as well as body size, were consistent and strong predictors of 'fighting ability'. Males with high fighting ability were larger and displayed a UV throat with low total reflectance. In contrast, males with low fighting ability were smaller and had violet throats with broader spectral reflectance curves (higher total reflectance). As fighting ability is associated with alternative reproductive tactics in this system (territorial versus floater), we also examined the role of colour signals in predicting male reproductive tactic. Territorial males had UV throats with higher chroma but had poorer body condition than floater males, probably because of the energetic costs of maintaining a territory. Although testosterone was not a significant predictor of fighting ability or reproductive tactic, it was correlated with the hue of the UV throat, suggesting that testosterone may impose some constraint on signal expression. Lastly, we show that within the context of the natural signalling environment, UV-reflective throats constitute a conspicuous, effective signal that male Augrabies flat lizards use to advertise their status honestly to rivals. (c) 2006 The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved.
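A minimal Python sketch of how the colour metrics named above (total reflectance, UV chroma, hue) are commonly summarised from a reflectance spectrum; the 300-400 nm UV band, the peak-wavelength definition of hue and the spectrum itself are assumptions for illustration, not details taken from the paper.

import numpy as np

def colour_metrics(wavelengths_nm, reflectance):
    # Hypothetical summary metrics for a reflectance spectrum:
    # total reflectance = mean reflectance across the measured range,
    # UV chroma = proportion of reflectance in 300-400 nm (assumed band),
    # hue = wavelength of maximum reflectance.
    wl = np.asarray(wavelengths_nm, dtype=float)
    refl = np.asarray(reflectance, dtype=float)
    total = refl.mean()
    uv_band = (wl >= 300) & (wl <= 400)
    uv_chroma = refl[uv_band].sum() / refl.sum()
    hue = wl[refl.argmax()]
    return total, uv_chroma, hue

# Hypothetical UV-peaked throat spectrum sampled every 5 nm from 300 to 700 nm
wl = np.arange(300, 705, 5)
refl = np.exp(-((wl - 360) ** 2) / (2 * 40 ** 2)) * 40 + 5
print(colour_metrics(wl, refl))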
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring without routine virological monitoring for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value in the presence of immunological and clinical data for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age <15 years, and continuous HAART for >= 6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, most recent viral load >5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte value, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for identification of children receiving HAART who are at risk for significant HIV-related illness. To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
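As a worked illustration of how an adjusted hazard ratio like the one above relates to the fitted Cox coefficient, the short Python sketch below converts a coefficient and its standard error into a hazard ratio with a Wald 95% confidence interval; the coefficient and standard error are illustrative values back-calculated from the interval quoted above, not figures reported in the abstract.

import math

def hazard_ratio_ci(beta, se, z=1.96):
    # Adjusted hazard ratio and Wald 95% CI from a Cox regression coefficient.
    hr = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return hr, lo, hi

# Illustrative values only (back-calculated from HR 1.81, 95% CI 1.05-3.11)
beta, se = 0.593, 0.277
print(hazard_ratio_ci(beta, se))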
Abstract:
Aims: There remains significant concern about the long-term safety of drug-eluting stents (DES). However, bare metal stents (BMS) have been used safely for over two decades. There is therefore a pressing need to explore alternative strategies for reducing restenosis with BMS. This study was designed to examine whether IVUS-guided cutting balloon angioplasty (CBA) with BMS could convey similar restenosis rates to DES. Methods and results: In the randomised REstenosis reDUction by Cutting balloon angioplasty Evaluation (REDUCE III) study, 521 patients were divided into four groups based on device and IVUS use before BMS (IVUS-CBA-BMS: 137 patients; Angio-CBA-BMS: 123; IVUS-BA-BMS: 142; and Angio-BA-BMS: 119). At follow-up, the IVUS-CBA-BMS group had a significantly lower restenosis rate (6.6%) than the other groups (p=0.016). We performed a quantitative coronary angiography (QCA) based matched comparison between an IVUS-guided CBA-BMS strategy (REDUCE III) and a DES strategy (Rapamycin-Eluting-Stent Evaluation At Rotterdam Cardiology Hospital, the RESEARCH study). We matched the presence of diabetes, vessel size, and lesion severity by QCA. Restenosis (>50% diameter stenosis at follow-up) and target vessel revascularisation (TVR) were examined. QCA-matched comparison resulted in 120-paired lesions. While acute gain was significantly greater in IVUS-CBA-BMS than DES (1.65 +/- 0.41 mm vs. 1.28 +/- 0.57 mm, p=0.001), late loss was significantly less with DES than with IVUS-CBA-BMS (0.03 +/- 0.42 mm vs. 0.80 +/- 0.47 mm, p=0.001). However, no difference was found in restenosis rates (IVUS-CBA-BMS: 6.6% vs. DES: 5.0%, p=0.582) and TVR (6.6% and 6.6%, respectively). Conclusions: An IVUS-guided CBA-BMS strategy yielded restenosis rates similar to those achieved by DES and provided an effective alternative to the use of DES.
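A small Python sketch of the standard QCA definitions used above (acute gain, late loss, binary restenosis at >50% diameter stenosis); the minimal-lumen-diameter values are hypothetical, not lesion data from the study.

def qca_metrics(mld_pre, mld_post, mld_followup, reference_diameter):
    # Standard quantitative coronary angiography summaries (all values in mm).
    acute_gain = mld_post - mld_pre              # lumen gained by the procedure
    late_loss = mld_post - mld_followup          # lumen lost during follow-up
    percent_stenosis = 100.0 * (1.0 - mld_followup / reference_diameter)
    restenosis = percent_stenosis > 50.0         # binary restenosis definition
    return acute_gain, late_loss, percent_stenosis, restenosis

# Hypothetical lesion in a 2.9 mm reference vessel
print(qca_metrics(mld_pre=0.9, mld_post=2.5, mld_followup=1.7,
                  reference_diameter=2.9))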
Abstract:
Background. A sample of 1089 Australian adults was selected for the longitudinal component of the Quake Impact Study, a 2-year, four-phase investigation of the psychosocial effects of the 1989 Newcastle earthquake. Of these, 845 (78%) completed a survey 6 months post-disaster as well as one or more of the three follow-up surveys. Methods. The phase I survey was used to construct dimensional indices of self-reported exposure to threat and disruption and also to classify subjects by their membership of five 'at risk' groups (the injured; the displaced; owners of damaged small businesses; helpers in threat and non-threat situations). Psychological morbidity was assessed at each phase using the 12-item General Health Questionnaire (GHQ-12) and the Impact of Event Scale (IES). Results. Psychological morbidity declined over time but tended to stabilize at about 12 months post-disaster for general morbidity (GHQ-12) and at about 18 months for trauma-related (IES) morbidity. Initial exposure to threat and/or disruption were significant predictors of psychological morbidity throughout the study and had superior predictive power to membership of the targeted 'at risk' groups. The degree of ongoing disruption and other life events since the earthquake were also significant predictors of morbidity. The injured reported the highest levels of distress, but there was a relative absence of morbidity among the helpers. Conclusions. Future disaster research should carefully assess the threat and disruption experiences of the survivors at the time of the event and monitor ongoing disruptions in the aftermath in order to target interventions more effectively.
Abstract:
Background. This paper examines the contributions of dispositional and non-dispositional factors to post-disaster psychological morbidity. Data reported are from the 845 participants in the longitudinal component of the Quake Impact Study. Methods. The phase 1 survey was used to construct dimensional indices of threat and disruption exposure. Subsequently, a range of dispositional characteristics were measured, including neuroticism, personal hopefulness and defence style. The main morbidity measures were the General Health Questionnaire (GHQ-12) and Impact of Event Scale (IES). Results. Dispositional characteristics were the best predictors of psychological morbidity throughout the 2 years post-disaster, contributing substantially more to the variance in morbidity (12-39%) than did initial exposure (5-12%), but the extent of their contribution was greater for general (GHQ-12) than for post-traumatic (IES) morbidity. Among the non-dispositional factors, avoidance coping contributed equally to general and post-traumatic morbidity (pr = 0.24). Life events since the earthquake (pr = 0.18), poor social relationships (pr = -0.25) and ongoing earthquake-related disruptions (pr = 0.22) also contributed to general morbidity, while only the latter contributed significantly to post-traumatic morbidity (pr = 0.15). Conclusions. Medium-term post-earthquake morbidity appears to be a function of multiple factors whose contributions vary depending on the type of morbidity experienced and include trait vulnerability, the nature and degree of initial exposure, avoidance coping and the nature and severity of subsequent events.
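The partial correlations (pr) reported above can be illustrated with a short Python sketch that residualises both variables on the control variables and correlates the residuals; the simulated data and variable names (neuroticism, coping, morbidity) are hypothetical stand-ins, not the study's measures.

import numpy as np

def partial_corr(x, y, controls):
    # Partial correlation of x and y, controlling for the columns of `controls`,
    # computed via ordinary-least-squares residuals.
    x, y = np.asarray(x, float), np.asarray(y, float)
    Z = np.column_stack([np.ones(len(x)), np.asarray(controls, float)])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical example: morbidity vs. avoidance coping, controlling for neuroticism
rng = np.random.default_rng(0)
neuroticism = rng.normal(size=200)
coping = 0.5 * neuroticism + rng.normal(size=200)
morbidity = 0.4 * neuroticism + 0.3 * coping + rng.normal(size=200)
print(round(partial_corr(morbidity, coping, neuroticism[:, None]), 2))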
Abstract:
Although planning is important for the functioning of patients with dementia of the Alzheimer Type (DAT), little is known about response programming in DAT. This study used a cueing paradigm coupled with quantitative kinematic analysis to document the preparation and execution of movements made by a group of 12 DAT patients and their age- and sex-matched controls. Participants connected a series of targets placed upon a WACOM SD420 graphics tablet, in response to the pattern of illumination of a set of light emitting diodes (LEDs). In one condition, participants could programme the upcoming movement, whilst in another they were forced to reprogramme this movement on-line (i.e. they were not provided with advance information about the location of the upcoming target). DAT patients were found to have programming deficits, taking longer to initiate movements, particularly in the absence of cues. While problems spontaneously programming a movement might cause a greater reliance upon on-line guidance, when both groups were required to guide the movement on-line, DAT patients continued to show slower and less efficient movements, implying declining sensori-motor function; these differences were not simply due to strategy or medication status. (C) 1997 Elsevier Science Ltd.
Abstract:
Objective: to determine the relationship between age and in-hospital mortality of elderly patients, admitted to ICU, requiring and not requiring invasive ventilatory support. Design: prospective observational cohort study conducted over a period of 11 months. Setting: medical-surgical ICU at a Brazilian university hospital. Subjects: a total of 840 patients aged 55 years and older were admitted to ICU. Methods: in-hospital death rates for patients requiring and not requiring invasive ventilatory support were compared across three successive age intervals (55-64; 65-74 and 75 or more years), adjusting for severity of illness using the Acute Physiologic Score. Results: age was strongly correlated with mortality among the invasively ventilated subgroup of patients and the multivariate adjusted odds ratios increased progressively with every age increment (OR = 1.60, 95% CI = 1.01-2.54 for 65-74 years old and OR = 2.68, 95% CI = 1.58-4.56 for >= 75 years). For the patients not submitted to invasive ventilatory support, age was not independently associated with in-hospital mortality (OR = 2.28, 95% CI = 0.99-5.25 for 65-74 years old and OR = 1.95, 95% CI = 0.82-4.62 for >= 75 years old). Conclusions: the combination of age and invasive mechanical ventilation is strongly associated with in-hospital mortality. Age should not be considered as a factor related to in-hospital mortality of elderly patients not requiring invasive ventilatory support in ICU.
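For context on the odds ratios reported above, the Python sketch below computes a crude (unadjusted) odds ratio with a Woolf log-based confidence interval from a 2x2 table; the counts are hypothetical and the study's own estimates were multivariate adjusted, so this is only a simplified illustration of the arithmetic.

import math

def odds_ratio_2x2(a, b, c, d, z=1.96):
    # Crude odds ratio with Woolf (log) 95% CI.
    # a = exposed & died, b = exposed & survived,
    # c = unexposed & died, d = unexposed & survived.
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: ventilated patients aged >= 75 versus 55-64 years
print(odds_ratio_2x2(a=40, b=60, c=20, d=80))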
Abstract:
Abnormal heart-rate (HR) response during or after a graded exercise test has been recognized as a strong and an independent predictor of all-cause mortality in healthy and diseased subjects. The purpose of the present study was to evaluate the HR response during exercise in women with systemic lupus erythematosus (SLE). In this case-control study, 22 women with SLE (age 29.5 +/- 1.1 years) were compared with 20 gender-, BMI-, and age-matched healthy subjects (age 26.5 +/- 1.4 years). A treadmill cardiorespiratory test was performed and HR response during exercise was evaluated by the chronotropic reserve (CR). HR recovery (Delta HRR) was defined as the difference between HR at peak exercise and at both first (Delta HRR1) and second (Delta HRR2) minutes after exercising. SLE patients presented lower peak VO2 when compared with healthy subjects (27.6 +/- 0.9 vs. 36.7 +/- 1.1 ml/kg/min, p = 0.001, respectively). Additionally, SLE patients demonstrated lower CR (71.8 +/- 2.4 vs. 98.2 +/- 2.6%, p = 0.001), Delta HRR1 (22.1 +/- 2.5 vs. 32.4 +/- 2.2%, p = 0.004) and Delta HRR2 (39.1 +/- 2.9 vs. 50.8 +/- 2.5%, p = 0.001) than their healthy peers. In conclusion, SLE patients presented abnormal HR response to exercise, characterized by chronotropic incompetence and delayed Delta HRR. Lupus (2011) 20, 717-720.
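A short Python sketch of the heart-rate indices defined above; the age-predicted maximum (taken here as 220 - age) and the sample values are assumptions for illustration, since the abstract does not state how the predicted maximum was derived.

def chronotropic_reserve(hr_rest, hr_peak, age):
    # Percentage of the heart-rate reserve used at peak exercise,
    # with the age-predicted maximum assumed to be 220 - age.
    hr_max_pred = 220 - age
    return 100.0 * (hr_peak - hr_rest) / (hr_max_pred - hr_rest)

def heart_rate_recovery(hr_peak, hr_after):
    # Delta HRR: fall in heart rate from peak to a given recovery minute.
    return hr_peak - hr_after

# Hypothetical 30-year-old patient
hr_rest, hr_peak, hr_1min, hr_2min = 75, 160, 138, 120
print(chronotropic_reserve(hr_rest, hr_peak, age=30))   # ~73.9%
print(heart_rate_recovery(hr_peak, hr_1min))            # Delta HRR1 = 22 beats
print(heart_rate_recovery(hr_peak, hr_2min))            # Delta HRR2 = 40 beats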
Abstract:
Background. The incidence of unexplained sudden death (SD) and the factors involved in its occurrence in patients with chronic kidney disease are not well known. Methods. We investigated the incidence and the role of co-morbidities in unexplained SD in 1139 haemodialysis patients on the renal transplant waiting list. Results. Forty-four patients died from SD of undetermined causes (20% of all deaths; 3.9 deaths/1000 patients per year), while 178 died from other causes and 917 survived. SD patients were older and more likely to have diabetes, hypertension and past/present cardiovascular disease, and had a higher left ventricular mass index and lower ejection fraction. Multivariate analysis showed that cardiovascular disease of any type was the only independent predictor of SD (P = 0.0001, HR = 2.13, 95% CI 1.46-3.22). Alterations closely associated with ischaemic heart disease, such as angina, previous myocardial infarction and an altered myocardial scan, were not independent predictors of SD. The incidence of unexplained SD in these haemodialysis patients is high and probably a consequence of pre-existing cardiovascular disease. Conclusions. Factors influencing SD in dialysis patients are not substantially different from factors in the general population. The role played by ischaemic heart disease in this context needs further evaluation.
Abstract:
Objective The objective of the study was to investigate whether depression is a predictor of postdischarge smoking relapse among patients hospitalized for myocardial infarction (MI) or unstable angina (UA), in a smoke-free hospital. Methods Current smokers with MI or UA were interviewed while hospitalized; patients classified with major depression (MD) or no mood disorder were reinterviewed 6 months post discharge to ascertain smoking status. Potential predictors of relapse (depression; stress; anxiety; heart disease risk perception; coffee and alcohol consumption; sociodemographic, clinical, and smoking habit characteristics) were compared between those with MD (n = 268) and no mood disorder (n = 135). Results Relapsers (40.4%) were more frequently and more severely depressed, had higher anxiety and lower self-efficacy scale scores, diagnosis of UA, shorter hospitalizations, started smoking younger, made fewer attempts to quit, less often had a partner, and were more frequently at the 'precontemplation' stage of change. Multivariate analysis showed relapse-positive predictors to be MD [odds ratio (OR): 2.549; 95% confidence interval (CI): 1.519-4.275] (P<0.001); 'precontemplation' stage of change (OR: 7.798; 95% CI: 2.442-24.898) (P<0.001); previous coronary bypass graft surgery (OR: 4.062; 95% CI: 1.356-12.169) (P=0.012); and previous anxiolytic use (OR: 2.365; 95% CI: 1.095-5.107) (P=0.028). Negative predictors were diagnosis of MI (OR: 0.575; 95% CI: 0.361-0.916) (P=0.019); duration of hospitalization (OR: 0.935; 95% CI: 0.898-0.973) (P=0.001); smoking onset age (OR: 0.952; 95% CI: 0.910-0.994) (P=0.028); number of attempts to quit smoking (OR: 0.808; 95% CI: 0.678-0.964) (P=0.018); and 'action' stage of change (OR: 0.065; 95% CI: 0.008-0.532) (P=0.010). Conclusion Depression, lack of motivation, shorter hospitalization, and severity of illness contributed to postdischarge resumption of smoking by patients with acute coronary syndrome who underwent hospital-initiated smoking cessation.
Abstract:
Objectives: To evaluate clinical and echocardiographic variables that could be used to predict outcomes in patients with asymptomatic severe aortic valve stenosis. Management of asymptomatic severe aortic stenosis is controversial. Because prophylactic surgery may be protective, independent predictors of events that could justify early surgery have been sought. Methods: Outpatients (n = 133; mean [+/- SD] age, 66.2 +/- 13.6 years) with isolated severe asymptomatic aortic stenosis but normal left ventricular function and no previous myocardial infarction were followed up prospectively at a tertiary care hospital. Interventions: We used a "wait-for-events" strategy. Clinical and echocardiographic variables were analyzed. Results: Nineteen patients developed angina; 40, dyspnea; 5, syncope; and 7, sudden death during a mean follow-up period of 3.30 +/- 1.87 years. Event-free survival was 90.2 +/- 2.6% at 1 year, 73.4 +/-.9% at 2 years, 70.7 +/- 4.3% at 3 years, 57.8 +/- 4.7% at 4 years, 40.3 +/- 5.0% at 5 years, and 33.3 +/- 5.2% at 6 years. The mean follow-up period until sudden death (1.32 +/- 1.11 years) was shorter than that for dyspnea (2.44 +/- 1.84 years), syncope (2.87 +/- 1.26 years) and angina (3.03 +/- 1.68 years). Cox regression analysis disclosed only a reduced, though still within normal limits, ejection fraction as an independent predictor of total events. Conclusions: Management based on a "wait-for-events" strategy is generally safe. Progressive left ventricular ejection fraction reduction, even within normal limits, identified patients at high risk for events, in whom valve replacement surgery should be considered. (c) 2007 Elsevier Ireland Ltd. All rights reserved.
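The event-free survival figures quoted above are of the kind produced by a Kaplan-Meier estimator; below is a compact Python sketch of that estimator on hypothetical (time, event) pairs. Ties between events and censorings are not handled, and the data are invented, so this is only an illustration of the method.

import numpy as np

def kaplan_meier(times, events):
    # Kaplan-Meier survival estimate.
    # times: follow-up time for each patient (years)
    # events: 1 if the event occurred at that time, 0 if censored
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times)
    surv, curve = 1.0, []
    for t, e in zip(times, events):
        if e:
            surv *= (at_risk - 1) / at_risk
        curve.append((t, surv))
        at_risk -= 1
    return curve

# Hypothetical follow-up of 8 patients (years, event indicator)
print(kaplan_meier([0.5, 1.2, 1.8, 2.0, 2.5, 3.1, 3.3, 4.0],
                   [1,   0,   1,   0,   1,   0,   1,   0]))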
Abstract:
Background We validated a strategy for diagnosis of coronary artery disease (CAD) and prediction of cardiac events in high-risk renal transplant candidates (at least one of the following: age >= 50 years, diabetes, cardiovascular disease). Methods A diagnosis and risk assessment strategy was used in 228 renal transplant candidates to validate an algorithm. Patients underwent dipyridamole myocardial stress testing and coronary angiography and were followed up until death, renal transplantation, or cardiac events. Results The prevalence of CAD was 47%. Stress testing did not detect significant CAD in one-third of patients. The sensitivity, specificity, and positive and negative predictive values of the stress test for detecting CAD were 70, 74, 69, and 71%, respectively. CAD, defined by angiography, was associated with increased probability of cardiac events [log-rank P = 0.001; hazard ratio: 1.90, 95% confidence interval (CI): 1.29-2.92]. Diabetes (P=0.03; hazard ratio: 1.58, 95% CI: 1.06-2.45) and angiographically defined CAD (P=0.03; hazard ratio: 1.69, 95% CI: 1.08-2.78) were the independent predictors of events. Conclusion The results validate our observations in a smaller number of high-risk transplant candidates and indicate that stress testing is not appropriate for the diagnosis of CAD or prediction of cardiac events in this group of patients. Coronary angiography was correlated with events but, because less than 50% of patients had significant disease, it seems premature to recommend the test to all high-risk renal transplant candidates. The results suggest that angiography is necessary in many high-risk renal transplant candidates and that better noninvasive methods are still lacking to identify with precision patients who will benefit from invasive procedures. Coron Artery Dis 21: 164-167 (C) 2010 Wolters Kluwer Health | Lippincott Williams & Wilkins.
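The test characteristics reported above follow directly from a 2x2 table of stress-test result against the angiographic reference standard; a Python sketch with hypothetical counts chosen only to be broadly consistent with the percentages quoted, not the study's actual table:

def diagnostic_performance(tp, fp, fn, tn):
    # Sensitivity, specificity, positive and negative predictive values
    # from a 2x2 table of test result versus angiographic reference standard.
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Hypothetical counts (n = 228, ~47% CAD prevalence)
print(diagnostic_performance(tp=75, fp=31, fn=32, tn=90))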
Abstract:
Left ventricular hypertrophy is an important predictor of cardiovascular risk and sudden death. This study explored the ability of four obesity indexes (body mass index, waist circumference, waist-hip ratio and waist-stature ratio) to identify left ventricular hypertrophy. A sample of the general population (n=682; 43.5% men) was surveyed to assess cardiovascular risk factors. Biochemical, anthropometric and blood pressure values were obtained in a clinic visit according to standard methods. Left ventricular mass was obtained from transthoracic echocardiogram. Left ventricular hypertrophy was defined using population-specific cutoff values for left ventricular mass indexed to height^2.7. The waist-stature ratio showed the strongest positive association with left ventricular mass. This correlation was stronger in women, even after controlling for age and systolic blood pressure. By multivariate analysis, the main predictors of left ventricular hypertrophy were waist-stature ratio (23%), systolic blood pressure (9%) and age (2%) in men, and waist-stature ratio (40%), age (6%) and systolic blood pressure (2%) in women. Receiver-operating characteristic curves showed the optimal cutoff values of the different anthropometric indexes associated with left ventricular hypertrophy. The waist-stature ratio was a significantly better predictor than the other indexes (except for the waist-hip ratio), independent of gender. It is noteworthy that a waist-stature ratio cutoff of 0.56 showed the highest combined sensitivity and specificity to detect left ventricular hypertrophy. Abdominal obesity identified by waist-stature ratio instead of overall obesity identified by body mass index is the simplest and best obesity index for assessing the risk of left ventricular hypertrophy, is a better predictor in women and has an optimal cutoff ratio of 0.56. Hypertension Research (2010) 33, 83-87; doi: 10.1038/hr.2009.188; published online 13 November 2009
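A Python sketch of the two computations underlying the analysis above: left ventricular mass indexed to height^2.7, and an ROC-style search for the waist-stature-ratio cutoff that maximises sensitivity plus specificity (the Youden criterion). All values and the simulated sample are hypothetical; the population-specific hypertrophy cutoffs used in the study are not reproduced here.

import numpy as np

def lv_mass_index(lv_mass_g, height_m):
    # Left ventricular mass (g) indexed to height^2.7 (g/m^2.7).
    return lv_mass_g / height_m ** 2.7

def best_cutoff(values, has_lvh):
    # Cutoff of a continuous index maximising sensitivity + specificity.
    values = np.asarray(values, float)
    has_lvh = np.asarray(has_lvh, bool)
    best = (None, -np.inf)
    for c in np.unique(values):
        pred = values >= c
        sens = (pred & has_lvh).sum() / has_lvh.sum()
        spec = (~pred & ~has_lvh).sum() / (~has_lvh).sum()
        if sens + spec > best[1]:
            best = (c, sens + spec)
    return best

print(round(lv_mass_index(lv_mass_g=180, height_m=1.70), 1))  # ~43.0 g/m^2.7

# Simulated waist-stature ratios and hypertrophy labels (hypothetical)
rng = np.random.default_rng(1)
wsr = rng.normal(0.55, 0.06, 300)
lvh = rng.random(300) < 1 / (1 + np.exp(-(wsr - 0.56) * 25))
print(best_cutoff(wsr, lvh))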
Abstract:
Background-Peculiar aspects of Chagas cardiomyopathy raise concerns about efficacy and safety of sympathetic blockade. We studied the influence of beta-blockers in patients with Chagas cardiomyopathy. Methods and Results-We examined REMADHE trial and grouped patients according to etiology (Chagas versus non-Chagas) and beta-blocker therapy. Primary end point was all-cause mortality or heart transplantation. Altogether 456 patients were studied; 27 (5.9%) were submitted to heart transplantation and 202 (44.3%) died. Chagas etiology was present in 68 (14.9%) patients; they had lower body mass index (24.1+/-4.1 versus 26.3+/-5.1, P=0.001), smaller end-diastolic left ventricle diameter (6.7+/-1.0 mm versus 7.0+/-0.9 mm, P=0.001), smaller proportion of beta-blocker therapy (35.8% versus 68%, P<0.001), and higher proportion of spironolactone therapy (74.6% versus 57.8%, P=0.003). Twenty-four (35.8%) patients with Chagas disease were under beta-blocker therapy and had lower serum sodium (136.6+/-3.1 versus 138.4+/-3.1 mEqs, P=0.05) and lower body mass index (22.5+/-3.3 versus 24.9+/-4.3, P=0.03) compared with those who did not receive beta-blockers. Survival was lower in patients with Chagas heart disease as compared with other etiologies. When only patients under beta-blockers were considered, the survival of patients with Chagas disease was similar to that of other etiologies. The survival of patients with beta-blockers was higher than that of patients without beta-blockers. In Cox regression model, left ventricle end-diastolic diameter (hazard ratio, 1.78; CI, 1.15 to 2.76; P=0.009) and beta-blockers (hazard ratio, 0.37; CI, 0.14 to 0.97; P=0.044) were associated with better survival. Conclusions-Our study suggests that beta-blockers may have beneficial effects on survival of patients with heart failure and Chagas heart disease and warrants further investigation in a prospective, randomized trial.
Abstract:
Objective Cardiovascular risk factors were surveyed in two Indian populations (Guarani, n=60; Tupinikin, n=496) and in a non-Indian group (n=114) living in the same reserve on the southeastern Brazilian coast. The relationship between the age-dependent blood pressure (BP) increase and salt consumption was also investigated. Methods Overnight (12 h) urine was collected to evaluate Na excretion. Fasting glucose and lipids, anthropometry, BP, ECG and carotid-femoral pulse wave velocity (PWV) were measured in a clinic visit. Participation (318 men/352 women, age 20-94 years; mean=37.6 +/- 14.9 years) comprised 80% of the eligible population. Results The prevalence of hypertension, diabetes and high cholesterol was similar in Tupinikins and in non-Indians and higher than in Guaranis. The prevalence of smoking and obesity was higher in the latter group. Hypertension and diabetes were detected in only one individual of the Guarani group. Mean BP adjusted to age and BMI was significantly lower (P<0.01) in Guaranis (82.8 +/- 1.6 mmHg) than in Tupinikins (92.3 +/- 0.5 mmHg) and non-Indians (91.6 +/- 1.1 mmHg). Urinary Na excretion (mEq/12h), however, was similar in the three groups (Guarani=94 +/- 40; Tupinikin=105 +/- 56; non-Indian=109 +/- 55; P>0.05). PWV (m/s) was lower (P<0.01) in Guarani (7.5 +/- 1.4) than in Tupinikins (8.8 +/- 2.2) and non-Indians (8.4 +/- 2.0). Multiple regression analysis showed that age and waist-to-hip ratio (WHR) were independent predictors of SBP and DBP (r^2=0.44) in Tupinikins, whereas the WHR was the unique independent predictor of BP variability in Guaranis (r^2=0.22). Conclusion Lower BP levels in Guaranis cannot be explained by the low salt intake observed in other primitive populations. J Hypertens 27:1753-1760 (C) 2009 Wolters Kluwer Health | Lippincott Williams & Wilkins.
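The regression step described above (BP regressed on age and waist-to-hip ratio, with r^2 reported) can be sketched in a few lines of Python; the data below are simulated stand-ins, not the study's measurements.

import numpy as np

def r_squared(X, y):
    # Ordinary least squares fit of y on the columns of X (plus intercept),
    # returning the coefficient of determination r^2.
    y = np.asarray(y, float)
    X = np.column_stack([np.ones(len(y)), np.asarray(X, float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

# Simulated sample: SBP versus age and waist-to-hip ratio (hypothetical)
rng = np.random.default_rng(2)
age = rng.uniform(20, 94, 400)
whr = rng.normal(0.9, 0.08, 400)
sbp = 95 + 0.4 * age + 30 * whr + rng.normal(0, 12, 400)
print(round(r_squared(np.column_stack([age, whr]), sbp), 2))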