970 results for disease stage
Abstract:
To determine the impact of chronic diseases, and of the number of diseases, on the various aspects of health-related quality of life (HRQOL) in older adults in São Paulo, Brazil. METHODS: The SF-36® health survey was used to assess the impact of the most prevalent chronic diseases on HRQOL. A cross-sectional, population-based study was carried out with two-stage stratified cluster sampling. Data were obtained from a multicenter health survey administered through household interviews in several municipalities of the state of São Paulo. Seven diseases (arthritis, back pain, depression/anxiety, diabetes, hypertension, osteoporosis, and stroke) and their effects on quality of life were assessed. RESULTS: Of the 1,958 older adults aged 60 years or more, 13.6% reported having none of the diseases, while 45.7% had three or more chronic diseases. The presence of any of the seven chronic diseases studied significantly influenced the scores on nearly all SF-36® scales. HRQOL scores were lowest when the person had depression/anxiety, osteoporosis, or had suffered a stroke. The greater the number of diseases, the greater the negative effects on the SF-36® dimensions. The presence of three or more diseases significantly affected HRQOL in all areas. The scales most affected by the diseases were bodily pain, general health, and vitality. CONCLUSIONS: A high prevalence of chronic diseases was found in the older adult population; the magnitude of the effect on HRQOL depended on the type of disease. These results highlight the importance of preventing and controlling chronic diseases in order to reduce comorbidity and lessen their impact on HRQOL in older adults.
Abstract:
Background The Western diet plays a role in the epidemics of obesity and related diseases. This study examined a possible association between peripheral arterial disease (PAD) and the dietary components of Japanese immigrants living in Brazil. Methods and Results In this cross-sectional study, 1,267 subjects (aged ≥30 years) with complete dietary, clinical and laboratory data were studied according to a standardized protocol. Ankle-to-brachial index was used to identify subjects with PAD. The overall prevalence of PAD was 14.6%. Subjects with PAD were older, had lower education and higher mean values of blood pressure, triglycerides, and fasting and 2-h plasma glucose levels compared with those without the disease. Among the subjects with PAD, the consumption of fiber from whole grains (3.0 vs 3.4 g, p=0.001) and linoleic acids (11.0 vs 11.7 g, p=0.017) was lower and the intake of total (72.8 vs 69.1 g, p=0.016) and saturated fatty acids (17.4 vs 16.3 g, p=0.012) was higher than in those without PAD. Results of multiple logistic regression analysis showed a significant association of PAD with high total fat intake and low intake of fiber from fruit and oleic acid, independently of other variables. Conclusions Despite limitations in examining the cause-effect relationship, the data support the notion that diet could be important in reducing the occurrence of PAD.
Abstract:
Background: Chronic Chagas disease cardiomyopathy (CCC) is an inflammatory dilated cardiomyopathy with a worse prognosis than other cardiomyopathies. CCC occurs in 30% of individuals infected with Trypanosoma cruzi, which is endemic in Latin America. Heart failure is associated with impaired energy metabolism, which may be correlated with contractile dysfunction. We thus analyzed the myocardial gene and protein expression, as well as activity, of key mitochondrial enzymes related to ATP production, in myocardial samples of end-stage CCC, idiopathic dilated (IDC) and ischemic (IC) cardiomyopathies. Methodology/Principal Findings: Myocardial homogenates from CCC (N = 5), IC (N = 5) and IDC (N = 5) patients, as well as from heart donors (N = 5), were analyzed for protein and mRNA expression of mitochondrial creatine kinase (CKMit), muscular creatine kinase (CKM), and ATP synthase subunits alpha and beta by immunoblotting and by real-time RT-PCR. Total myocardial CK activity was also assessed. Protein levels of CKM and CK activity were reduced in all three cardiomyopathy groups. However, total CK activity, as well as ATP synthase alpha chain protein levels, were significantly lower in CCC samples than in IC and IDC samples. CCC myocardium displayed selective reduction of protein levels and activity of enzymes crucial for maintaining cytoplasmic ATP levels. Conclusions/Significance: The selective impairment of the CK system may be associated with the loss of inotropic reserve observed in CCC. Reduction of ATP synthase alpha levels is consistent with a decrease in myocardial ATP generation through oxidative phosphorylation. Together, these results suggest that the energetic deficit is more intense in the myocardium of CCC patients than in the other tested dilated cardiomyopathies.
Abstract:
Objectives - A highly adaptive aspect of human memory is the enhancement of explicit, consciously accessible memory by emotional stimuli. We studied the performance of Alzheimer's disease (AD) patients and elderly controls using a memory battery with emotional content, and we correlated these results with amygdala and hippocampus volumes. Methods - Twenty controls and 20 early AD patients were subjected to the International Affective Picture System (IAPS) and to magnetic resonance imaging-based volumetric measurements of the medial temporal lobe structures. Results - The results show that, after excluding control group subjects with 5 or more years of schooling, both groups showed improvement with pleasant or unpleasant pictures from the IAPS in an immediate free recall test. Likewise, in a delayed free recall test, both the controls and the AD group showed improvement for pleasant pictures when the education factor was not controlled. The AD group showed improvement in the immediate and delayed free recall tests proportional to the medial temporal lobe structures, with no significant clinical correlation between affective valence and amygdala volume. Conclusion - AD patients can correctly identify emotions, at least at this early stage, but this does not improve their memory performance.
Abstract:
Influence of soybean phenological stage and leaflet age on infection by Phakopsora pachyrhizi. This work was conducted to study the influence of soybean growth stage and leaf age on infection by Phakopsora pachyrhizi, the soybean rust pathogen. Soybean plants (cv. BRS 154 and BRS 258) at the V3, R1 and R5 growth stages were inoculated with a suspension of 1 x 10^5 urediniospores per mL. After a period of 24 hours in dew chambers, all plants were removed from the chambers and placed under greenhouse conditions for 20 days. Mean latent period (PLM) and disease severity were estimated. The susceptibility of trifoliate leaves to soybean rust was estimated on cv. BRS 154 at the R5 growth stage. Pathogen inoculation was done on the first four trifoliate leaves. Fifteen days after inoculation, leaflets of each trifoliate leaf were evaluated for disease severity, mean lesion size and infection frequency. The plants' growth stage did not influence the PLM. Cultivars BRS 154 and BRS 258 presented PLM of 8 and 9 days, respectively. There was no difference in disease severity between the V3 and R1 growth stages, but those values were higher than at the R5 growth stage, 8 days after inoculation. The oldest trifoliate leaf showed the highest disease values.
Abstract:
The postharvest development of crown rot of bananas depends notably on the fruit susceptibility to this disease at harvest. It has been shown that fruit susceptibility to crown rot is variable and it was suggested that this depends on environmental preharvest factors. However, little is known about the preharvest factors influencing this susceptibility. The aim of this work was to evaluate the extent to which fruit filling characteristics during growth and the fruit development stage influence the banana susceptibility to crown rot. This involved evaluating the influence of (a) the fruit position at different levels of the banana bunch (hands) and (b) changing the source-sink ratio (So-Si ratio), on the fruit susceptibility to crown rot. The fruit susceptibility was determined by measuring the internal necrotic surface (INS) after artificial inoculation of Colletotrichum musae. A linear correlation (r = -0.95) was found between the hand position on the bunch and the INS. The So-Si ratio was found to influence the pomological characteristics of the fruits and their susceptibility to crown rot. Fruits of bunches from which six hands were removed (two hands remaining on the bunch) proved to be significantly less susceptible to crown rot (INS = 138.3 mm²) than those from bunches with eight hands (INS = 237.9 mm²). The banana susceptibility to crown rot is thus likely to be influenced by the fruit development stage and filling characteristics. The present results highlight the importance of standardising hand sampling on a bunch when testing fruit susceptibility to crown rot. They also show that hand removal in the field has advantages in the context of integrated pest management, making it possible to reduce fruit susceptibility to crown rot while increasing fruit size.
Abstract:
There is now considerable evidence to suggest that non-demented people with Parkinson's disease (PD) experience difficulties using the morphosyntactic aspects of language. It remains unclear, however, at precisely which point in the processing of morphosyntax these difficulties emerge. The major objective of the present study was to examine the impact of PD on the processes involved in accessing morphosyntactic information in the lexicon. Nineteen people with PD and 19 matched control subjects participated in the study, which employed on-line word recognition tasks to examine morphosyntactic priming for local grammatical dependencies that occur both within (e.g. is going) and across (e.g. she gives) phrasal boundaries (Experiments 1 and 2, respectively). The control group evidenced robust morphosyntactic priming effects that were consistent with the involvement of both pre- (Experiment 1) and post-lexical (Experiment 2) processing routines. Whilst the participants with PD also recorded priming for dependencies within phrasal boundaries (Experiment 1), priming effects were observed over an abnormally brief time course. Further, in contrast to the controls, the PD group failed to record morphosyntactic priming for constructions that crossed phrasal boundaries (Experiment 2). The results demonstrate that attentionally mediated mechanisms operating at both the pre- and post-lexical stages of processing are able to contribute to morphosyntactic priming effects. In addition, the findings support the notion that, whilst people with PD are able to access morphosyntactic information in a normal manner, the time frame in which this information remains available for processing is altered. Deficits may also be experienced at the post-lexical integrational stage of processing.
Abstract:
In recent years, the phrase 'genomic medicine' has increasingly been used to describe a new development in medicine that holds great promise for human health. This new approach to health care uses the knowledge of an individual's genetic make-up to identify those that are at a higher risk of developing certain diseases and to intervene at an earlier stage to prevent these diseases. Identifying genes that are involved in disease aetiology will provide researchers with tools to develop better treatments and cures. A major role within this field is attributed to 'predictive genomic medicine', which proposes screening healthy individuals to identify those who carry alleles that increase their susceptibility to common diseases, such as cancers and heart disease. Physicians could then intervene even before the disease manifests and advise individuals with a higher genetic risk to change their behaviour - for instance, to exercise or to eat a healthier diet - or offer drugs or other medical treatment to reduce their chances of developing these diseases. These promises have fallen on fertile ground among politicians, health-care providers and the general public, particularly in light of the increasing costs of health care in developed societies. Various countries have established databases on the DNA and health information of whole populations as a first step towards genomic medicine. Biomedical research has also identified a large number of genes that could be used to predict someone's risk of developing a certain disorder. But it would be premature to assume that genomic medicine will soon become reality, as many problems remain to be solved. Our knowledge about most disease genes and their roles is far from sufficient to make reliable predictions about a patient’s risk of actually developing a disease. In addition, genomic medicine will create new political, social, ethical and economic challenges that will have to be addressed in the near future.
Abstract:
Movement-related cortical potentials recorded from the scalp reveal increasing cortical activity occurring prior to voluntary movement. Studies of set-related cortical activity recorded from single neurones within premotor and supplementary motor areas in monkeys suggest that such premovement activity may act to prime activity of appropriate motor units in readiness to move, thereby facilitating the movement response. Such a role of early stage premovement activity in movement-related cortical potentials was investigated by examining the relationship between premovement cortical activity and movement initiation or reaction times. Parkinson's disease and control subjects performed a simple button-pressing reaction time task and individual movement-related potentials were averaged for responses with short compared with long reaction times. For Parkinson's disease subjects but not for the control subjects, early stage premovement cortical activity was significantly increased in amplitude for faster reaction times, indicating that there is indeed a relationship between premovement cortical activity amplitude and movement initiation or reaction times. In support of studies of set-related cortical activity in monkeys, it is therefore suggested that early stage premovement activity reflects the priming of appropriate motor units of primary motor cortex, thereby reducing movement initiation or reaction times.
Abstract:
Hypokinetic movement can be greatly improved in Parkinson's disease patients by the provision of external cues to guide movement. It has recently been reported, however, that movement performance in parkinsonian patients can be similarly improved in the absence of external cues by using attentional strategies, whereby patients are instructed to consciously attend to particular aspects of the movement which would normally be controlled automatically. To study the neurophysiological basis of such improvements in performance associated with the use of attentional strategies, movement-related cortical potentials were examined in Parkinson's disease and control subjects using a reaction time paradigm. One group of subjects were explicitly instructed to concentrate on internally timed responses to anticipate the presentation of a predictably timed go signal. Other subjects were given no such instruction regarding attentional strategies. Early-stage premovement activity of movement-related potentials was significantly increased in amplitude and reaction times were significantly faster for Parkinson's disease subjects when instructed to direct their attention toward internally generating responses rather than relying on external cues. It is therefore suggested that the use of attentional strategies may allow movement to be mediated by less automatic and more conscious attentional motor control processes which may be less impaired by basal ganglia dysfunction, and thereby improve movement performance in Parkinson's disease.
Abstract:
Background and objectives Low bone mineral density and coronary artery calcification (CAC) are highly prevalent among chronic kidney disease (CKD) patients, and both conditions are strongly associated with higher mortality. The study presented here aimed to investigate whether reduced vertebral bone density (VBD) was associated with the presence of CAC in the earlier stages of CKD. Design, setting, participants, & measurements Seventy-two nondialyzed CKD patients (age 52 +/- 11.7 years, 70% male, 42% diabetics, creatinine clearance 40.4 +/- 18.2 ml/min per 1.73 m²) were studied. VBD and CAC were quantified by computed tomography. Results CAC > 10 Agatston units (AU) was observed in 50% of the patients (median 120 AU [interquartile range 32 to 584 AU]), and a calcification score >= 400 AU was found in 19% (736 [527 to 1012] AU). VBD (190 +/- 52 Hounsfield units) correlated inversely with age (r = -0.41, P < 0.001) and calcium score (r = -0.31, P = 0.01), and no correlation was found with gender, creatinine clearance, proteinuria, lipid profile, mineral parameters, body mass index, and diabetes. Patients in the lowest tertile of VBD had markedly increased calcium scores in comparison to the middle and highest tertile groups. In the multiple logistic regression analysis adjusting for confounding variables, low VBD was independently associated with the presence of CAC. Conclusions Low VBD was associated with CAC in nondialyzed CKD patients. The authors suggest that low VBD might constitute another nontraditional risk factor for cardiovascular disease in CKD. Clin J Am Soc Nephrol 6: 1456-1462, 2011. doi: 10.2215/CJN.10061110
Abstract:
The guidelines proposed by the Kidney Disease Outcomes Quality Initiative (K/DOQI) suggested that intact parathyroid hormone (iPTH) should be maintained in a target range between 150 and 300 pg/ml for patients with stage 5 chronic kidney disease. Our study sought to verify the effectiveness of that range in preventing bone remodeling problems in hemodialysis patients. We measured serum ionized calcium and phosphorus, while iPTH was measured by a second-generation assay. Transiliac bone biopsies were performed at the onset of the study and after completing 1 year of follow-up. The iPTH levels fell within the target range in about one-fourth of the patients at baseline and at the end of the study. The bone biopsies of two-thirds of the patients were classified as showing low turnover and one-fourth showed high turnover, the remainder having normal turnover. In the group achieving the target levels of iPTH, 88% had low turnover. Intact PTH levels less than 150 pg/ml for identifying low turnover and greater than 300 pg/ml for high turnover presented positive predictive values of 83% and 62%, respectively. Our study suggests that the iPTH target recommended by the K/DOQI guidelines was associated with a high incidence of low-turnover bone disease, suggesting that other biochemical markers may be required to accurately measure bone-remodeling status in hemodialysis patients.
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring without routine virological monitoring for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value, in the presence of immunological and clinical data, for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age <15 years, and continuous HAART for >= 6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, a most recent viral load >5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte value, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for identification of children receiving HAART who are at risk for significant HIV-related illness. To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
Abstract:
Aim: A positive effect of liver transplantation on health-related quality of life (HRQOL) has been well documented in previous studies using generic instruments. Our aim was to re-evaluate different aspects of HRQOL before and after liver transplantation with a relatively new questionnaire, the 'liver disease quality of life' (LDQOL) instrument. Methods: The LDQOL and the Short Form 36 (SF-36) questionnaires were applied to ambulatory patients, either on the transplant list (n=65) or 6 months to 5 years after liver transplant (n=61). The aetiology of cirrhosis, comorbidities, model for end-stage liver disease (MELD) and Child-Pugh scores, and recurrence of liver disease after liver transplantation were analysed using the Mann-Whitney and Kruskal-Wallis tests. Results: In patients awaiting liver transplantation, MELD scores >= 15 and Child-Pugh class C were associated with statistically significantly worse HRQOL, using both the SF-36 and the LDQOL questionnaires. HRQOL in pretransplant patients was found to be significantly worse in those with cirrhosis owing to hepatitis C (n=30) when compared with other aetiologies (n=35) in 2/7 domains of the SF-36 and in 7/12 domains of the LDQOL. Significant deterioration of HRQOL after recurrence of hepatitis C post-transplant was detected with the LDQOL questionnaire, although it was not demonstrated with the SF-36. The statistically significant differences were in the LDQOL domains: symptoms of liver disease, concentration, memory and health distress. Conclusions: The LDQOL, a specific instrument for measuring HRQOL, has shown greater accuracy in relation to liver symptoms and could demonstrate, with better reliability, impairments before and after liver transplantation.
Abstract:
Purpose: Transanal endorectal pull-through (TEPT) has drastically changed the treatment of Hirschsprung's disease (HD). A short follow-up of children submitted to TEPT reveals results that are similar to those of the classic transabdominal pull-through procedures. However, few reports compare the late results of TEPT with transabdominal pull-through procedures with respect to complication rates and fecal continence. The aims of the present work are to describe some technical refinements that we introduced in the procedure and to compare the short- and long-term outcomes of TEPT with the outcomes of a group of patients with HD who previously underwent the Duhamel procedure. Methods: Thirty-five patients who underwent TEPT were prospectively studied and compared to a group of 29 patients who were treated with colostomy followed by a classical Duhamel pull-through. The main modifications introduced in the TEPT group were no preoperative colon preparation, operation conducted under general anesthesia in addition to regional sacral anesthesia, use of only one purse-string suture in the rectal mucosa before transanal submucosal dissection, and no use of retractors and electrocautery during the submucosal dissection. Results: The most frequent early complications in the TEPT group were perineal dermatitis (22.8%) and anastomotic strictures (8.6%). The comparison with patients who underwent the Duhamel procedure revealed no difference in the incidence of preoperative enterocolitis; the patients of the TEPT group were younger at the time of diagnosis and of surgery, they had shorter operating times, and they began oral feeding more quickly after the operation. The incidence of wound infection was lower in the TEPT group. Moreover, the TEPT and Duhamel groups showed no difference in the incidences of mortality, postoperative partial continence, and total incontinence. Although the incidences of complete continence and postoperative enterocolitis were not different, a tendency toward increased incidence in the TEPT group was observed. Conclusions: This study further supports the technical advantages, the simplicity, and the decreased incidence of complications of a primary TEPT procedure when compared to a classical form of pull-through. Some technical refinements are described, and no preoperative colon preparation was necessary for the patients studied here. The results show that the long-term outcomes of the modified TEPT procedure are generally better than those obtained with classical approaches.