934 results for "perceived stress questionnaire"
Abstract:
To study the correlation between caries experience in individuals with cerebral palsy (CP) and the quality of life of their primary caregivers. Sixty-five non-institutionalized individuals with CP, aged 2-21 years old, were evaluated for caries experience. Their respective caregivers, aged 20-74 years old, answered the Short Form 36 (SF-36) health survey and the Independence Measure for Children. Fifty-eight non-disabled individuals (ND group), aged 2-21 years old, and their respective caregivers, aged 25-56 years old, underwent the same evaluation process as the CP group. Primary caregivers of CP individuals exhibited significantly lower scores than the ND group in all subscales of the SF-36 health survey questionnaire: physical functioning, physical role, bodily pain, general health, vitality, social functioning, emotional role and mental health. The CP group presented significantly higher values for the Decayed, Missing and Filled Teeth (DMF-T) index than the ND group, and a significant negative correlation was obtained between the SF-36 and DMF-T index. The results suggest that caregivers of CP individuals exhibited worse quality of life than those of non-disabled individuals. A negative correlation exists between the caries experience of CP individuals and their caregivers' quality of life.
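As a pointer for readers unfamiliar with the index, here is a minimal sketch of how a DMF-T score is tallied and how a rank correlation with caregiver SF-36 scores could be computed; all per-subject values below are hypothetical, not the study's data.

```python
# Illustrative only: DMF-T tallying and a Spearman rank correlation.
# The per-subject values are hypothetical, not the study's raw data.
from scipy.stats import spearmanr

def dmft(decayed: int, missing: int, filled: int) -> int:
    """DMF-T index: count of decayed, missing and filled teeth."""
    return decayed + missing + filled

# Hypothetical DMF-T values paired with caregiver SF-36 summary scores.
dmft_scores = [dmft(4, 1, 2), dmft(0, 0, 1), dmft(6, 2, 3), dmft(2, 0, 0)]
sf36_scores = [45.0, 82.5, 38.0, 70.0]

rho, p = spearmanr(dmft_scores, sf36_scores)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # a negative rho, as the study reports
```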
Abstract:
We present a controlled stress microviscometer with applications to complex fluids. Based on dual-beam optical tweezers, it generates and measures microscopic fluid velocity fields, allowing investigation of both bulk viscous properties and local inhomogeneities at the probe particle surface. The accuracy of the method is demonstrated in water. In a model complex fluid (hyaluronic acid), we observe a strong deviation of the flow field from classical behavior. Knowledge of the deviation, together with an optical torque measurement, is used to determine the bulk viscosity. Furthermore, we model the observed deviation and derive microscopic parameters.
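For context, the classical baseline against which the reported flow-field deviation is measured is Stokes drag on a spherical probe; a minimal sketch of the textbook relation, with illustrative bead parameters rather than the paper's measurements.

```python
# Stokes' law for a sphere in a Newtonian fluid: F = 6*pi*eta*r*v,
# so the bulk viscosity follows as eta = F / (6*pi*r*v).
# Numbers below are illustrative, not measurements from the study.
import math

def viscosity_from_stokes_drag(force_N: float, radius_m: float, speed_m_s: float) -> float:
    """Bulk viscosity (Pa*s) from the drag force on a probe sphere."""
    return force_N / (6.0 * math.pi * radius_m * speed_m_s)

# A 1 um-radius bead dragged at 10 um/s in water feels ~0.19 pN:
eta = viscosity_from_stokes_drag(force_N=1.88e-13, radius_m=1e-6, speed_m_s=1e-5)
print(f"eta ~ {eta * 1e3:.2f} mPa*s")  # ~1 mPa*s, the viscosity of water
```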
Abstract:
Background: Obesity and obstructive sleep apnea (OSA) are both associated with a higher prevalence of major cardiovascular illnesses, and certain factors common to both conditions, such as increased oxidative stress, sympathetic tonus and insulin resistance, are considered responsible. Objective: The aim of the present study was to compare the effect of continuous positive airway pressure (CPAP) on oxidative stress and adiponectin levels in obese patients with and without OSA. Methods: Twenty-nine obese patients were categorized into 3 groups: group 1: 10 individuals in whom OSA was ruled out at polysomnography (apnea-hypopnea index, AHI <= 5); group 2: 10 patients with moderate to severe OSA (AHI >= 20) who did not use CPAP; group 3: 9 patients with moderate to severe OSA (AHI >= 20) who used CPAP. Results: Group 3 showed significant differences before and after the use of CPAP: diminished superoxide production, and increased nitrite and nitrate synthesis and adiponectin levels. Positive correlations were seen between the AHI and superoxide production, between the nitrite and nitrate levels and the adiponectin levels, between superoxide production and the HOMA-IR, and between the AHI and the HOMA-IR. Negative correlations were found between the AHI and the nitrite and nitrate levels, between superoxide production and nitric oxide production, between superoxide production and the adiponectin levels, between the AHI and the adiponectin levels, and between the nitrite and nitrate levels and the HOMA-IR. Conclusions: This study demonstrates that the use of CPAP can reverse the increased superoxide production, the diminished serum nitrite and nitrate and plasma adiponectin levels, and the metabolic changes existing in obese patients with OSA.
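For reference, the HOMA-IR index appearing in the correlations above is conventionally computed with the Matthews et al. (1985) formula; the abstract does not state the assay or units used, so standard units are assumed in this minimal sketch.

```python
def homa_ir(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    """HOMA-IR = (fasting glucose [mmol/L] * fasting insulin [uU/mL]) / 22.5."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

# Example with plausible fasting values (hypothetical, not study data):
print(round(homa_ir(5.5, 12.0), 2))  # 2.93; higher values indicate greater insulin resistance
```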
Abstract:
It has been suggested that phosphate binders may reduce the inflammatory state of hemodialysis (HD) patients. However, it is not clear whether they have any effect on oxidative stress. The objective of this study was to evaluate the effect of sevelamer hydrochloride (SH) and calcium acetate (CA) on oxidative stress and inflammation markers in HD patients. Hemodialysis patients were randomly assigned to therapy with SH (n=17) or CA (n=14) for 1 year. Before the initiation of therapy (baseline) and at 12 months, we measured in vitro reactive oxygen species (ROS) production by stimulated and unstimulated polymorphonuclear neutrophils, as well as serum levels of tumor necrosis factor alpha, interleukin-10, C-reactive protein, and albumin. There was a significant reduction of spontaneous ROS production in both groups after 12 months of therapy. There was a significant decrease of Staphylococcus aureus-stimulated ROS production in the SH group. There was a significant increase in albumin serum levels only in the SH group. In the SH group, there was also a decrease in the serum levels of tumor necrosis factor alpha and C-reactive protein. Our results suggest that, compared with CA treatment, SH may lead to a reduction in oxidative stress and inflammation. Therefore, it is possible that phosphate binders exert pleiotropic effects on oxidative stress and inflammation, which could contribute toward decreasing endothelial injury in patients on HD.
Abstract:
Although a new protocol of dobutamine stress echocardiography with the early injection of atropine (EA-DSE) has been demonstrated to reduce adverse effects, increase the number of effective tests, and have accuracy similar to that of conventional protocols for detecting coronary artery disease (CAD), no data exist regarding its ability to predict long-term events. The aim of this study was to determine the prognostic value of EA-DSE and the effects of the long-term use of beta blockers on it. A retrospective evaluation of 844 patients who underwent EA-DSE for known or suspected CAD was performed; 309 (37%) were receiving beta blockers. During a median follow-up period of 24 months, 102 events (12%) occurred. On univariate analysis, predictors of events were the ejection fraction (p <0.001), male gender (p <0.001), previous myocardial infarction (p <0.001), angiotensin-converting enzyme inhibitor therapy (p = 0.021), calcium channel blocker therapy (p = 0.034), and abnormal results on EA-DSE (p <0.001). On multivariate analysis, the independent predictors of events were male gender (relative risk [RR] 1.78, 95% confidence interval [CI] 1.13 to 2.81, p = 0.013) and abnormal results on EA-DSE (RR 4.45, 95% CI 2.84 to 7.01, p <0.0001). Normal results on EA-DSE with beta blockers were associated with a nonsignificantly higher incidence of events than normal results on EA-DSE without beta blockers (RR 1.29, 95% CI 0.58 to 2.87, p = 0.54). Abnormal results on EA-DSE with beta blockers had an RR of 4.97 (95% CI 2.79 to 8.87, p <0.001) compared with normal results, while abnormal results on EA-DSE without beta blockers had an RR of 5.96 (95% CI 3.41 to 10.44, p <0.001) for events, with no difference between groups (p = 0.36). In conclusion, the detection of fixed or inducible wall motion abnormalities during EA-DSE was an independent predictor of long-term events in patients with known or suspected CAD. The prognostic value of EA-DSE was not affected by the long-term use of beta blockers. (Am J Cardiol 2008;102:1291-1295)
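To make the reported RR and CI figures concrete, here is a sketch of the standard log-normal approximation for a risk ratio's 95% CI in a simple two-group setting; the abstract's estimates come from a multivariate model, and the event counts below are hypothetical.

```python
# Risk ratio with a 95% CI via the log-normal approximation:
# SE(ln RR) = sqrt(1/a - 1/n_a + 1/b - 1/n_b). Counts are hypothetical.
import math

def risk_ratio_ci(events_a: int, n_a: int, events_b: int, n_b: int):
    rr = (events_a / n_a) / (events_b / n_b)
    se_log_rr = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, (lo, hi)

rr, (lo, hi) = risk_ratio_ci(events_a=40, n_a=200, events_b=10, n_b=200)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")  # RR = 4.00, CI ~2.06 to 7.78
```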
Abstract:
Objective: To evaluate the usefulness of gamma-glutamyltransferase (GGT) and mean corpuscular volume (MCV), as well as that of the CAGE questionnaire, in workplace screening for alcohol abuse/dependence. Methods: A total of 183 male employees underwent structured interviews (Structured Clinical Interview for DSM-IV 2.0 and CAGE questionnaire), and blood samples were collected. Diagnostic accuracy and odds ratios were determined for the CAGE, GGT and MCV. Results: The CAGE questionnaire presented the best sensitivity for alcohol dependence (91%; specificity, 87.8%) and for alcohol abuse (87.5%; specificity, 80.9%), which increased when the questionnaire was used in combination with GGT (sensitivity, 100% and 87.5%, respectively; specificity, 68% and 61.5%, respectively). CAGE-positive results and/or alterations in GGT were less likely to occur among employees not presenting alcohol abuse/dependence than among those presenting such abuse (OR for CAGE = 13, p < 0.05; OR for CAGE-GGT = 11, p < 0.05) or dependence (OR for CAGE = 76, p < 0.01; OR for GGT = 5, p < 0.01). Employees not presenting alcohol abuse/dependence were also several times more likely to present negative CAGE or GGT results. Conclusions: The use of short, simple questionnaires, combined with low-cost biochemical markers such as GGT, can serve as an initial screening for alcohol-related problems, especially for employees in hazardous occupations. The data provided can serve to corroborate clinical findings.
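A sketch of how the screening metrics quoted above follow from a 2x2 table of test result against diagnosis; the counts are hypothetical, chosen only to land near the reported CAGE-for-dependence figures.

```python
# Sensitivity, specificity and odds ratio from a 2x2 screening table.
# tp/fp/fn/tn counts are hypothetical, not the study's raw table.
def screening_metrics(tp: int, fp: int, fn: int, tn: int):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    odds_ratio = (tp * tn) / (fp * fn)  # cross-product ratio
    return sensitivity, specificity, odds_ratio

sens, spec, odds = screening_metrics(tp=20, fp=20, fn=2, tn=141)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, OR {odds:.0f}")
# -> sensitivity 90.9%, specificity 87.6%, OR 70
```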
Abstract:
Lentil is a self-pollinating diploid (2n = 14 chromosomes) annual cool-season legume crop that is produced throughout the world and is highly valued as a high-protein food. Several abiotic stresses are important to lentil yields worldwide, including drought, heat, salt susceptibility and iron deficiency. The biotic stresses are numerous and include susceptibility to Ascochyta blight, caused by Ascochyta lentis; anthracnose, caused by Colletotrichum truncatum; Fusarium wilt, caused by Fusarium oxysporum; Sclerotinia white mold, caused by Sclerotinia sclerotiorum; rust, caused by Uromyces fabae; and numerous aphid-transmitted viruses. Lentil is also highly susceptible to several species of Orobanche prevalent in the Mediterranean region, for which there does not appear to be much resistance in the germplasm. Plant breeders and geneticists have addressed these stresses by identifying resistant/tolerant germplasm, determining the genetics involved and mapping the positions of the resistance genes. To this end, progress has been made in mapping the lentil genome, and several genetic maps are available that eventually will lead to the development of a consensus map for lentil. Marker density has been limited in the published genetic maps, and there is a distinct lack of co-dominant markers that would facilitate comparisons of the available genetic maps and efficient identification of markers closely linked to genes of interest. Molecular breeding of lentil for disease resistance genes using marker-assisted selection, particularly for resistance to Ascochyta blight and anthracnose, is underway in Australia and Canada, and promising results have been obtained. Comparative genomics and synteny analyses with closely related legumes promise to further advance knowledge of the lentil genome and provide lentil breeders with additional genes and selectable markers for use in marker-assisted selection. Genomic tools such as macro- and microarrays, reverse genetics and genetic transformation are emerging technologies that may eventually be available for use in lentil crop improvement.
Abstract:
A history of childhood trauma and the presence of dissociative phenomena are considered to be the most important risk factors for psychogenic nonepileptic seizure disorder (PNESD). This case-control study investigated 20 patients with PNESD and 20 with temporal lobe epilepsy (TLE), diagnosed by video/EEG monitoring and matched for gender and age. Patients with both conditions were not included in the study. Groups were evaluated for age at onset and at diagnosis, worst lifetime weekly seizure frequency, trauma history, and presence of dissociative phenomena. Age at onset (P = 0.007) and age at diagnosis (P < 0.001) were significantly higher in the PNESD group than in the control group, as were the scores on the Dissociative Experiences Scale (P < 0.001) and Childhood Trauma Questionnaire (P = 0.014). Among the Childhood Trauma Questionnaire subscales, only the differences in Emotional Neglect (P = 0.013) and Emotional Abuse (P = 0.014) scores reached statistical significance. Dissociative phenomena and a reported history of childhood trauma are more common in patients with PNESD than in those with TLE. However, only emotional neglect and abuse were associated with PNESD in this study.
Abstract:
Recently, stress myocardial computed tomographic perfusion (CTP) was shown to detect myocardial ischemia. Our main objective was to evaluate the feasibility of dipyridamole stress CTP and compare it with single-photon emission computed tomography (SPECT) for detecting significant coronary stenosis, using invasive conventional coronary angiography (CCA; stenosis >70%) as the reference method. Thirty-six patients (62 +/- 8 years old, 20 men) with previous positive results on SPECT (<2 months) as the primary inclusion criterion and suspected coronary artery disease underwent a customized multidetector-row CT protocol with myocardial perfusion evaluation at rest and during stress and coronary CT angiography (CTA). Multidetector-row computed tomography was performed in a 64-slice scanner with dipyridamole stress perfusion acquisition before a second perfusion/CT angiographic acquisition at rest. Independent blinded observers performed analysis of images from CTP, CTA, and CCA. All 36 patients completed the CT protocol with no adverse events (mean radiation dose 14.7 +/- 3.0 mSv) and with interpretable scans. CTP results were positive in 27 of 36 patients (75%). Of the 9 (25%) disagreements, 6 patients had normal coronary arteries and 2 had no significant stenosis (8 false-positive results with SPECT, 22%); the remaining patient had an occluded artery with collateral flow confirmed by conventional coronary angiogram. Good agreement was demonstrated between CTP and SPECT on a per-patient analysis (kappa 0.53). In 26 patients, using CCA as reference, sensitivity, specificity, and positive and negative predictive values were 88.0%, 79.3%, 66.7%, and 93.3% for CTP and 68.8%, 76.1%, 66.7%, and 77.8% for SPECT, respectively (p = NS). In conclusion, dipyridamole CT myocardial perfusion at rest and during stress is feasible, with results similar to those of single-photon emission CT scintigraphy. The anatomical-perfusion information provided by this combined CT protocol may allow identification of false-positive results by SPECT. (Am J Cardiol 2010;106:310-315)
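The quoted predictive values are tied to sensitivity, specificity and disease prevalence through Bayes' rule; a minimal sketch follows, with an assumed prevalence rather than the study's actual case mix.

```python
# PPV and NPV from sensitivity, specificity and prevalence (Bayes' rule).
# The prevalence value is illustrative, not taken from the study.
def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    ppv = (sensitivity * prevalence) / (
        sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
    npv = (specificity * (1 - prevalence)) / (
        specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
    return ppv, npv

# Using the CTP operating point reported above and an assumed 35% prevalence:
ppv, npv = predictive_values(sensitivity=0.88, specificity=0.793, prevalence=0.35)
print(f"PPV {ppv:.1%}, NPV {npv:.1%}")  # ~69.6% and ~92.5%
```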
Abstract:
The role of physiological understanding in improving the efficiency of breeding programs is examined largely from the perspective of conventional breeding programs. The impact of physiological research to date on breeding programs, and the nature of that research, was assessed from (i) responses to a questionnaire distributed to plant breeders and physiologists, and (ii) a survey of literature abstracts. Ways to better utilise physiological understanding for improving breeding programs are suggested, together with possible constraints to delivering beneficial outcomes. Responses to the questionnaire indicated a general view that the contribution of crop physiology to date has been modest. However, most of those surveyed expected the contribution to be larger in the next 20 years. Some constraints to progress perceived by breeders and physiologists were highlighted. The survey of literature abstracts indicated that, from a plant breeding perspective, much physiological research is not progressing further than making suggestions about possible approaches to selection. There was limited evidence in the literature of objective comparison of such suggestions with existing methodology, or of their development and application within active breeding programs. It is argued in this paper that the development of outputs from physiological research for breeding requires a good understanding of the breeding program(s) being serviced and the factors affecting their performance. Simple quantitative genetic models, or at least the ideas they represent, should be considered in conducting physiological research and in envisaging and evaluating outputs. The key steps of a generalised breeding program are outlined, and the potential pathways for physiological understanding to impact on these steps are discussed. Impact on breeding programs may arise through (i) better choice of environments in which to conduct selection trials, (ii) identification of selection criteria and traits for focused introgression programs, and (iii) identification of traits for indirect selection criteria as an adjunct to criteria already used. While many breeders and physiologists apparently recognise that physiological understanding may have a major role in the first area, there appears to be relatively little research activity targeting this issue, and a corresponding bias, arguably unjustified, toward examining traits for indirect selection. Furthermore, research on traits aimed at crop improvement is often deficient because key genetic parameters, such as genetic variation in relevant breeding populations and genetic (as opposed to phenotypic) correlations with yield or other characters of economic importance, are not properly considered in the research. Some areas requiring special attention for successfully interfacing physiological research with breeding are discussed. These include (i) the need to work with relevant genetic populations, (ii) close integration of the physiological research with an active breeding program, and (iii) the dangers of a pre-defined or narrow focus in the physiological research.
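The discussion of indirect selection and genetic correlations above is usually formalized with the standard correlated-response expressions of quantitative genetics (as in Falconer and Mackay); a minimal sketch with illustrative parameter values, not figures from the paper.

```python
# Correlated vs. direct response to selection (standard quantitative genetics).
# h_x, h_y are square roots of heritabilities; r_g is the genetic correlation.
# All parameter values below are illustrative.
def correlated_response(i: float, h_x: float, h_y: float, r_g: float, sigma_p_y: float) -> float:
    """CR_Y = i * h_X * h_Y * r_G * sigma_P(Y): gain in target trait Y
    when selection is applied to a secondary (e.g. physiological) trait X."""
    return i * h_x * h_y * r_g * sigma_p_y

def direct_response(i: float, h_y: float, sigma_p_y: float) -> float:
    """R_Y = i * h_Y**2 * sigma_P(Y): gain from selecting on Y itself."""
    return i * h_y**2 * sigma_p_y

# Indirect selection pays off only when r_g * h_x > h_y:
print(correlated_response(i=1.76, h_x=0.8, h_y=0.4, r_g=0.6, sigma_p_y=1.0))  # ~0.34
print(direct_response(i=1.76, h_y=0.4, sigma_p_y=1.0))                        # ~0.28
```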
Abstract:
Aim: A positive effect of liver transplantation on health-related quality of life (HRQOL) has been well documented in previous studies using generic instruments. Our aim was to re-evaluate different aspects of HRQOL before and after liver transplantation with a relatively new questionnaire, the 'Liver Disease Quality of Life' (LDQOL) instrument. Methods: The LDQOL and the Short Form 36 (SF-36) questionnaires were applied to ambulatory patients, either on the transplant waiting list (n=65) or 6 months to 5 years after liver transplantation (n=61). The aetiology of cirrhosis, comorbidities, model for end-stage liver disease (MELD) and Child-Pugh scores, and recurrence of liver disease after liver transplantation were analysed using the Mann-Whitney and Kruskal-Wallis tests. Results: In patients awaiting liver transplantation, MELD scores >= 15 and Child-Pugh class C were associated with statistically significantly worse HRQOL on both the SF-36 and the LDQOL questionnaires. HRQOL in pretransplant patients was significantly worse in those with cirrhosis owing to hepatitis C (n=30) than in those with other aetiologies (n=35) in 2/7 domains of the SF-36 and in 7/12 domains of the LDQOL. Significant deterioration of HRQOL after recurrence of hepatitis C post-transplant was detected with the LDQOL questionnaire, although not with the SF-36; the statistically significant differences were in the LDQOL domains of symptoms of liver disease, concentration, memory and health distress. Conclusions: The LDQOL, a specific instrument for measuring HRQOL, showed greater accuracy in relation to liver symptoms and could demonstrate, with better reliability, impairments before and after liver transplantation.
Abstract:
Background. A sample of 1089 Australian adults was selected for the longitudinal component of the Quake Impact Study, a 2-year, four-phase investigation of the psychosocial effects of the 1989 Newcastle earthquake. Of these, 845 (78%) completed a survey 6 months post-disaster as well as one or more of the three follow-up surveys. Methods. The phase I survey was used to construct dimensional indices of self-reported exposure to threat and disruption, and also to classify subjects by their membership of five 'at risk' groups (the injured; the displaced; owners of damaged small businesses; helpers in threat and in non-threat situations). Psychological morbidity was assessed at each phase using the 12-item General Health Questionnaire (GHQ-12) and the Impact of Event Scale (IES). Results. Psychological morbidity declined over time but tended to stabilize at about 12 months post-disaster for general morbidity (GHQ-12) and at about 18 months for trauma-related (IES) morbidity. Initial exposure to threat and/or disruption was a significant predictor of psychological morbidity throughout the study and had superior predictive power to membership of the targeted 'at risk' groups. The degree of ongoing disruption and other life events since the earthquake were also significant predictors of morbidity. The injured reported the highest levels of distress, but there was a relative absence of morbidity among the helpers. Conclusions. Future disaster research should carefully assess the threat and disruption experiences of survivors at the time of the event and monitor ongoing disruptions in the aftermath in order to target interventions more effectively.
Abstract:
Background. This paper examines the contributions of dispositional and non-dispositional factors to post-disaster psychological morbidity. Data reported are from the 845 participants in the longitudinal component of the Quake Impact Study. Methods. The phase 1 survey was used to construct dimensional indices of threat and disruption exposure. Subsequently, a range of dispositional characteristics were measured, including neuroticism, personal hopefulness and defence style. The main morbidity measures were the General Health Questionnaire (GHQ-12) and Impact of Event Scale (IES). Results. Dispositional characteristics were the best predictors of psychological morbidity throughout the 2 years post-disaster, contributing substantially more to the variance in morbidity (12-39%) than did initial exposure (5-12%), but the extent of their contribution was greater for general (GHQ-12) than for post-traumatic (IES) morbidity. Among the non-dispositional factors, avoidance coping contributed equally to general and post-traumatic morbidity (pr = 0.24). Life events since the earthquake (pr = 0.18), poor social relationships (pr = -0.25) and ongoing earthquake-related disruptions (pr = 0.22) also contributed to general morbidity, while only the latter contributed significantly to post-traumatic morbidity (pr = 0.15). Conclusions. Medium-term post-earthquake morbidity appears to be a function of multiple factors whose contributions vary depending on the type of morbidity experienced and include trait vulnerability, the nature and degree of initial exposure, avoidance coping and the nature and severity of subsequent events.
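The 'pr' values reported above are partial correlations; a sketch of the first-order formula, which removes the linear effect of a single covariate from a Pearson correlation, with illustrative inputs rather than the study's data.

```python
# First-order partial correlation of x and y controlling for one covariate z,
# computed from the three pairwise Pearson correlations. Inputs are illustrative.
import math

def partial_corr(r_xy: float, r_xz: float, r_yz: float) -> float:
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))

print(f"{partial_corr(r_xy=0.35, r_xz=0.5, r_yz=0.4):.2f}")  # 0.19
```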
Abstract:
Background Brazil has one of the highest stroke mortality rates in the world, and these rates are highest among the poor. We assessed the prevalence of stroke in a deprived neighbourhood in the city of Sao Paulo, Brazil, and compared it with other surveys worldwide. Methods A questionnaire with six questions concerning limb and facial weakness, articulation, sensory disturbances, impaired vision, and past diagnosis of stroke was completed door-to-door in a well-defined area of 15 000 people. Questionnaires were considered positive when a participant answered two or more of the stroke-symptom questions affirmatively and the stroke had been confirmed by a physician, or answered at least three questions affirmatively even without a physician's confirmation. Results Of the 4496 individuals over 35 years old living in the area, 243 initially screened positive for stroke. The age-adjusted prevalence rate for men was 4.6% (95% confidence interval 3.5-5.7). For women, the prevalence rate was 6.5% (95% confidence interval 5.5-7.5); when considering only one question, the rate was 4.8% (95% confidence interval 3.9-5.7). The most commonly reported symptoms were limb weakness and sensory disturbances. Hypertension and heart disease were the conditions most commonly associated with previous stroke. Conclusion Stroke prevalence rates were higher in this poor neighbourhood than in other surveys.
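A sketch of the crude prevalence estimate underlying the age-adjusted rates above, using the abstract's own screening counts (243 positives among 4496 residents over 35) with a Wald 95% CI; age adjustment would then reweight stratum-specific versions of this estimate to a standard population.

```python
# Crude prevalence proportion with a Wald 95% confidence interval.
# Counts are taken from the abstract; the CI method is the generic one,
# not necessarily what the study used.
import math

def prevalence_ci(cases: int, n: int, z: float = 1.96):
    p = cases / n
    se = math.sqrt(p * (1 - p) / n)
    return p, (p - z * se, p + z * se)

p, (lo, hi) = prevalence_ci(cases=243, n=4496)
print(f"crude prevalence {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # ~5.4% (4.7%-6.1%)
```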