841 results for PSYCHOLOGICAL-FACTORS INCREASE
Abstract:
Over the last couple of decades, the UK experienced a substantial increase in the incidence and geographical spread of bovine tuberculosis (TB), in particular since the epidemic of foot-and-mouth disease (FMD) in 2001. The initiation of the Randomized Badger Culling Trial (RBCT) in 1998 in south-west England provided an opportunity for an in-depth collection of questionnaire data (covering farming practices, herd management and husbandry, trading and wildlife activity) from herds having experienced a TB breakdown between 1998 and early 2006 and randomly selected control herds, both within and outside the RBCT (the so-called TB99 and CCS2005 case-control studies). The data collated were split into four separate and comparable substudies relating to either the pre-FMD or post-FMD period, which are brought together and discussed here for the first time. The findings suggest that the risk factors associated with TB breakdowns may have changed. Higher Mycobacterium bovis prevalence in badgers following the FMD epidemic may have contributed to the identification of the presence of badgers on a farm as a prominent TB risk factor only post-FMD. The strong emergence of contact/trading TB risk factors post-FMD suggests that the purchasing and movement of cattle, which took place to restock FMD-affected areas after 2001, may have exacerbated the TB problem. Post-FMD analyses also highlighted the potential impact of environmental factors on TB risk. Although no unique and universal solution exists to reduce the transmission of TB to and among British cattle, there is evidence to suggest that applying the broad principles of biosecurity on farms reduces the risk of infection. However, with trading remaining an important route of local and long-distance TB transmission, improvements in the detection of infected animals during pre- and post-movement testing should further reduce the geographical spread of the disease.
Abstract:
When masculine forms are used to refer to men and women, this causes male-biased cognitive representations and behavioral consequences, as numerous studies have shown. This effect can be avoided or reduced with the help of gender-fair language. In this talk, we will present different approaches that aim at influencing people's use of and attitudes towards gender-fair language. Firstly, we tested the influence of gender-fair input on people's own use of gender-fair language. Based on Irmen and Linner's (2005) adaptation of the scenario mapping and focus approach (Sanford & Garrod, 1998), we found that after reading a text with gender-fair forms women produced more gender-fair forms than women who read gender-neutral texts or texts containing masculine generics. Men were not affected. Secondly, we examined reactions to arguments which followed the Elaboration Likelihood Model (Petty & Cacioppo, 1986). We assumed that strong pros and cons would be more effective than weak arguments or control statements. The results indicated that strong pros could convince some, but not all, participants, suggesting a complex interplay of diverse factors in reaction to attempts at persuasion. The influence of people's initial characteristics will be discussed. Currently, we are investigating how self-generated refutations, in addition to arguments, may influence initial attitudes. Based on the resistance appraisal hypothesis (Tormala, 2008), we assume that individuals are strengthened in their initial attitude if they manage to refute strong counter-arguments. The results of our studies will be discussed regarding their practical implications.
Abstract:
Both psychosocial stress and exercise have been used in past research as stressors to elevate saliva cortisol and change state anxiety levels. In the present study, high-school students aged 14 were randomly assigned to three experimental groups: (1) an exercise group (n = 18) that ran for 15 minutes at a medium intensity of 65-75% HRmax, (2) a psychosocial stress group (n = 19), and (3) a control group (n = 18). Psychosocial stress was induced by having the students complete a standardized intelligence test under the assumption that their IQ scores would be made public in class. Results show that psychosocial stress, but not exercise, significantly increased cortisol levels while decreasing cognitive state anxiety in these adolescents. The psychosocial stress protocol applied here is proposed for use in future stress studies with children or adolescents in group settings, e.g., in school.
Abstract:
OBJECTIVE: Occupational low back pain (LBP) is considered to be the most expensive form of work disability, with the socioeconomic costs of persistent LBP exceeding the costs of acute and subacute LBP by far. This makes the early identification of patients at risk of developing persistent LBP essential, especially in working populations. The aim of the study was to evaluate both risk factors (for the development of persistent LBP) and protective factors (preventing the development of persistent LBP) in the same cohort. PARTICIPANTS: An inception cohort of 315 patients with acute to subacute or with recurrent LBP was recruited from 14 health practitioners (twelve general practitioners and two physiotherapists) across New Zealand. METHODS: Patients with persistent LBP at six-month follow-up were compared to patients with non-persistent LBP with respect to occupational, psychological, biomedical and demographic/lifestyle predictors at baseline, using multiple logistic regression analyses. All significant variables from the different domains were combined into one predictor model. RESULTS: A final two-predictor model with an overall predictive value of 78% included social support at work (OR 0.67; 95%CI 0.45 to 0.99) and somatization (OR 1.08; 95%CI 1.01 to 1.15). CONCLUSIONS: Social support at work should be considered as a resource preventing the development of persistent LBP, whereas somatization should be considered as a risk factor for the development of persistent LBP. Further studies are needed to determine if addressing these factors in workplace interventions for patients suffering from acute, subacute or recurrent LBP prevents subsequent development of persistent LBP.
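The two reported predictors can be read back as logistic-regression coefficients: a protective factor has an odds ratio below 1 (negative coefficient), a risk factor an odds ratio above 1 (positive coefficient). A minimal Python sketch (illustrative only; the study does not state its exact interval construction, so a Wald-type 95% CI is assumed) recovers the coefficient and its standard error from a published OR and CI:

```python
import math

def coef_from_or(or_point, ci_low, ci_high, z=1.96):
    """Recover the logistic coefficient (log-odds) and its standard
    error from a published odds ratio with an assumed Wald-type CI."""
    beta = math.log(or_point)                   # coefficient = ln(OR)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * z)
    return beta, se

# Values reported for the two-predictor model above:
beta_support, se_support = coef_from_or(0.67, 0.45, 0.99)  # social support at work
beta_somat, se_somat = coef_from_or(1.08, 1.01, 1.15)      # somatization

print(beta_support, se_support)  # negative coefficient -> protective
print(beta_somat, se_somat)      # positive coefficient -> risk factor
```

Applied to the social-support OR of 0.67 (0.45 to 0.99), this gives a coefficient of about -0.40 with a standard error of about 0.20, confirming the protective direction of the effect.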
Abstract:
The Health Belief Model (HBM) provided the theoretical framework for examining Universal Precautions (UP) compliance factors among Emergency Department nurses. A random sample of Emergency Nurses Association (ENA) clinical nurses (n = 900) from five states (New York, New Jersey, California, Texas, and Florida) was surveyed to explore the factors related to their decision to comply with UP. Five hundred ninety-eight (598) usable questionnaires were analyzed. The respondents were primarily female (84.9%), hospital-based (94.6%) staff nurses (66.6%) with a mean of 8.5 years of emergency nursing experience. The nurses represented all levels of hospitals, from rural (4.5%) to urban trauma centers (23.7%). The mean number of UP training hours was 3.0 (range 0-38 hours). Linear regression was used to analyze the four hypotheses. The first hypothesis, evaluating perceived susceptibility and seriousness against reported UP use, was not significant (p > .05). Hypothesis 2 tested perceived benefits against internal and external barriers. Both perceived benefits and internal barriers, as well as the overall regression, were significant (F = 26.03, p < 0.001). Hypothesis 3, which tested modifying factors, cues to action, select demographic variables, and the main effects of the HBM against self-reported UP compliance, was also significant (F = 12.39, p < 0.001). The additive effects were tested by a stepwise regression that assessed the contribution of each of the significant variables. The regression was significant (F = 12.39, p < 0.001) and explained 18% of the total variance.
In descending order of contribution, the significant variables related to compliance were: internal barriers (t = -6.267; p < 0.001), such as the perception that, given the nature of the emergency care environment, there is sometimes inadequate time to put on UP; cues to action (t = 3.195; p = 0.001), such as posted reminder signs or verbal reminders from peers; the number of Universal Precautions training hours (t = 3.667; p < 0.001), meaning that compliance increases with the number of training hours; perceived benefits (t = 3.466; p = 0.001), such as believing that UP will provide adequate barrier protection; and perceived susceptibility (t = 2.880; p = 0.004), such as feeling at risk of exposure.
Abstract:
The increased use of vancomycin in hospitals has made monitoring of serum vancomycin levels standard practice because of possible nephrotoxicity. However, routine monitoring of vancomycin serum concentration has come under criticism, and its cost effectiveness is in question, because frequent monitoring neither increases efficacy nor decreases nephrotoxicity. The purpose of the present study was to determine factors that may place patients at increased risk of developing vancomycin-induced nephrotoxicity and for whom monitoring may be most beneficial. From September to December 1992, 752 consecutive inpatients at The University of Texas M. D. Anderson Cancer Center, Houston, were prospectively evaluated for nephrotoxicity in order to describe predictive risk factors for developing vancomycin-related nephrotoxicity. Ninety-five patients (13 percent) developed nephrotoxicity. A total of 299 patients (40 percent) were considered monitored (vancomycin serum levels determined during the course of therapy), and 346 patients (46 percent) were receiving concurrent moderately to highly nephrotoxic drugs. Factors significantly associated with nephrotoxicity in univariate analysis were: gender, baseline serum creatinine greater than 1.5 mg/dl, monitoring, leukemia, concurrent moderately to highly nephrotoxic drugs, and APACHE III scores of 40 or more.
Significant factors in the univariate analysis were then entered into a stepwise logistic regression analysis to determine independent predictive risk factors for vancomycin-induced nephrotoxicity. The factors selected by stepwise logistic regression as predictive of vancomycin-induced nephrotoxicity, with their corresponding odds ratios and 95% confidence limits, were: concurrent therapy with moderately to highly nephrotoxic drugs (2.89; 1.76-4.74), APACHE III scores of 40 or more (1.98; 1.16-3.38), and male gender (1.98; 1.04-2.71). Subgroup (monitored and non-monitored) analysis showed that male gender (OR = 1.87; 95% CI = 1.01, 3.45) and moderately to highly nephrotoxic drugs (OR = 4.58; 95% CI = 2.11, 9.94) were significant predictors of nephrotoxicity in monitored patients, whereas only the APACHE III score (OR = 2.67; 95% CI = 1.13, 6.29) was significant in non-monitored patients. The conclusion drawn from this study is that not every patient receiving vancomycin therapy needs frequent monitoring of vancomycin serum levels. Such routine monitoring may be appropriate in patients with one or more of the identified risk factors, while low-risk patients need not be subjected to the discomfort and added cost of multiple blood sampling. Such prudent selection of patients to monitor may decrease costs to patients and the hospital.
Abstract:
Bacillus anthracis plasmid pXO1 carries genes for three anthrax toxin proteins, pag (protective antigen), cya (edema factor), and lef (lethal factor). Expression of the toxin genes is enhanced by two signals: CO₂/bicarbonate and temperature. The CO₂/bicarbonate effect requires the presence of pXO1. I hypothesized that pXO1 harbors a trans-acting regulatory gene(s) required for CO₂/bicarbonate-enhanced expression of the toxin genes. Characterization of such a gene(s) will lead to increased understanding of the mechanisms by which B. anthracis senses and responds to host environments. A regulatory gene (atxA) on pXO1 was identified. Transcription of all three toxin genes is decreased in an atxA-null mutant. There are two transcriptional start sites for pag. Transcription from the major site, P1, is enhanced in elevated CO₂. Only P1 transcripts are significantly decreased in the atxA mutant. Deletion analysis of the pag upstream region indicates that the 111-bp region upstream of the P1 site is sufficient for atxA-mediated increase of this transcript. The cya and lef genes each have one apparent transcriptional start site. The cya and lef transcripts are significantly decreased in the atxA mutant. The atxA mutant is avirulent in mice. The antibody response to all three toxin proteins is significantly decreased in atxA mutant-infected mice. These data suggest that the atxA gene product activates expression of the toxin genes and is essential for virulence. Since expression of the toxin genes is dependent on atxA, whether increased toxin gene expression in response to CO₂/bicarbonate and temperature is associated with increased atxA expression was investigated. I monitored steady state levels of atxA mRNA and AtxA protein under different growth conditions. The results indicate that expression of atxA is not influenced by CO₂/bicarbonate. Steady state levels of atxA mRNA and AtxA protein are higher at 37°C than at 28°C.
However, increased pag expression at high temperature cannot be attributed directly to increased atxA expression. There is evidence that an additional factor(s) may be involved in the regulation of pag. Expression of pag in strains overproducing AtxA is significantly decreased compared to the wild-type strain. A specific interaction of tagged AtxA with the pag upstream DNA has not been demonstrated. Furthermore, four proteins in B. anthracis extract can be co-immunoprecipitated with tagged AtxA. The amino-terminal sequence of one of these proteins has been determined and found to be highly homologous to chaperonins of the GroEL family. Studies are under way to determine whether this GroEL-like protein interacts with AtxA and plays any role in atxA-mediated activation of the toxin genes.
Abstract:
The cause of testicular cancer is not known, and recent hypotheses have suggested that an altered hormonal milieu may increase the risk of testis cancer. This study examined modulation of testicular cancer risk by hormonal factors, specifically: environmental xenoestrogens (e.g. organochlorines), prenatal maternal estrogens, testosterone indices (age at puberty, severe adolescent acne, self-reported balding), sedentary lifestyle and dietary consumption of fats and phytoestrogens. A hospital-based, friend-matched case-control study was conducted at the University of Texas M. D. Anderson Cancer Center in Houston, Texas, between January 1990 and October 1996. Cases had a first primary testis tumor diagnosed between ages 18 and 50 years and resided in Texas, Louisiana, Oklahoma or Arkansas. Cases and friend controls completed a mail questionnaire, and case/control mothers were contacted by phone regarding pregnancy-related variables. The study population comprised 187 cases, 148 controls, 147 case mothers and 86 control mothers. Odds ratios were virtually identical whether the match was retained or dissolved, thus the analyses were conducted using unconditional logistic regression. Cryptorchidism was a strong risk factor for testis cancer, with an age-adjusted odds ratio (OR) of 7.7 (95% confidence interval (CI): 2.3-26.3). In a final model (adjusted for age, education, and cryptorchidism), history of severe adolescent acne and self-reported balding were both significantly protective, as hypothesized. For acne (yes vs. no) the OR was 0.5 (CI: 0.3-1.0) and for balding (yes vs. no) the OR was 0.6 (CI: 0.3-1.0). Marijuana smoking was a risk factor among heavy, regular users (17 times/week, OR = 2.4; CI: 0.9-6.4), and higher saturated fat intake increased testis cancer risk (saturated fat intake > 15.2 grams/day vs. < 11.8 grams/day, OR = 3.3; CI: 1.5-7.1).
Early puberty, xenoestrogen exposure, elevated maternal estrogen levels, sedentary lifestyle and dietary phytoestrogen intake were not associated with risk of testicular cancer. In conclusion, testicular cancer may be associated with endogenous androgen metabolism, although environmental estrogen exposure cannot be ruled out. Further research is needed to understand the underlying hormonal mechanisms and possible dietary influences.
Abstract:
Despite the increase in divorces after long relationships, this trend remains a neglected research topic. The present contribution seeks to identify patterns of psychological adaptation to divorce after a long-term marriage. Data from a questionnaire study with 308 persons aged 45–65 years, who divorced after having been married for an average of 25 years, are presented. Exploratory latent profile analysis with various well-being outcomes revealed five groups: one of average adapters, one of resilients, and three small groups of seriously affected individuals. Discriminant variables between the groups were personality, time since separation, a new relationship, and financial situation. Age, gender, and length of marriage played a marginal role; satisfaction with the former marriage and initiator status were not relevant.
Abstract:
Spousal loss is an inevitable critical life event for most individuals in old age and is mostly associated with a negative impact on various well-being measures, i.e., lower life satisfaction and higher rates of loneliness and depressive symptoms compared to married peers. While the negative effects on well-being are well documented in the literature, the modifying factors accounting for the large variability in adaptation to loss are discussed controversially. The potential relevance of personality in the adaptation process has rarely been examined, and findings regarding the role of time since loss are contradictory. Based on a vulnerability-stress model, this contribution aims (a) to compare the psychological well-being of bereaved individuals with that of married counterparts and (b) to investigate the protective effects of different personality traits (Big Five, resilience) and the role of time since loss for adaptation in terms of life satisfaction, loneliness and depression. Data from a questionnaire study about the loss of a spouse in middle and old age in the German- and French-speaking parts of Switzerland are reported. The study is part of the Swiss National Centre of Competence in Research LIVES (Swiss National Science Foundation). The sample consists of 351 widowed persons (39% men, widowed for 0-5 years) and 605 married controls (50% men), aged 60-89 years. Group comparisons reveal the detrimental effect of spousal bereavement on all indicators of psychological adaptation. Results from hierarchical regression analyses furthermore show that the effect of spousal loss on all psychological outcomes is moderated by personality traits. Separate analyses within the group of bereaved individuals suggest that the protective effect of personality varies with the time passed since loss. Our results contribute to a better understanding of the variability in psychological adaptation to spousal loss in old age and give hints for counselling practice.
Abstract:
Marital breakup is among the most incisive stressors in adult life. While the negative effects of divorce on well-being are well documented in the research literature, the large interindividual differences in psychological adaptation to marital dissolution are still not well understood. One controversially discussed question is whether marital dissolution represents a temporary crisis or rather a chronic strain; results regarding the role of gender are also mixed. The aim of the present study is to investigate psychological adaptation (depression, perceived stress and life satisfaction) to marital breakup in a sample of 980 middle-aged persons (M = 51.8 years) who had been partnered for an average of 19.4 years. We compared four time groups: one with a separation within the last 12 months (84 women, 36 men), another within the last 13-24 months (75 women, 19 men), a third within 25-60 months (121 women, 49 men), and one with a separation more than 60 months ago (189 women, 144 men). A group of 348 age-matched married people served as a control group (189 women, 159 men). ANOVAs with the outcomes depression and perceived stress yielded significant main effects for both factors (gender and time). The group who had experienced a separation within the last 12 months differed significantly from all other groups (higher depression scores and higher perceived stress). No significant main effect of time was found for the outcome life satisfaction. Regarding gender differences, women from all time groups displayed higher depression scores and higher perceived stress but lower life satisfaction than men. These results give important insights into the process of adaptation to marital breakup, which can be used for counselling.
Abstract:
The causes of a greening trend detected in the Arctic using the normalized difference vegetation index (NDVI) are still poorly understood. Changes in NDVI are a result of multiple ecological and social factors that affect tundra net primary productivity. Here we use a 25 year time series of AVHRR-derived NDVI data (AVHRR: advanced very high resolution radiometer), climate analysis, a global geographic information database and ground-based studies to examine the spatial and temporal patterns of vegetation greenness on the Yamal Peninsula, Russia. We assess the effects of climate change, gas-field development, reindeer grazing and permafrost degradation. In contrast to the case for Arctic North America, there has not been a significant trend in summer temperature or NDVI, and much of the pattern of NDVI in this region is due to disturbances. There has been a 37% change in early-summer coastal sea-ice concentration, a 4% increase in summer land temperatures and a 7% change in the average time-integrated NDVI over the length of the satellite observations. Gas-field infrastructure is not currently extensive enough to affect regional NDVI patterns. The effect of reindeer is difficult to quantitatively assess because of the lack of control areas where reindeer are excluded. Many of the greenest landscapes on the Yamal are associated with landslides and drainage networks that have resulted from ongoing rapid permafrost degradation. A warming climate and enhanced winter snow are likely to exacerbate positive feedbacks between climate and permafrost thawing. We present a diagram that summarizes the social and ecological factors that influence Arctic NDVI. The NDVI should be viewed as a powerful monitoring tool that integrates the cumulative effect of a multitude of factors affecting Arctic land-cover change.
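The "change in the average time-integrated NDVI over the length of the satellite observations" described above is, in practice, estimated as a linear trend over the yearly NDVI series. A minimal Python sketch on synthetic data (values are illustrative, not Yamal measurements) shows the ordinary least-squares slope that such trend analyses rest on:

```python
def ols_slope(years, ndvi):
    """Ordinary least-squares slope of NDVI against year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(ndvi) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, ndvi))
    sxx = sum((x - mean_x) ** 2 for x in years)
    return sxy / sxx

# Synthetic 25-year series: a flat baseline NDVI of 0.45 with a
# built-in drift of 0.001 NDVI units per year (illustrative only).
years = list(range(1982, 2007))
ndvi = [0.45 + 0.001 * (y - 1982) for y in years]
print(ols_slope(years, ndvi))  # recovers the 0.001/year drift
```

Real analyses would additionally test the slope against its standard error; the abstract's point is that on the Yamal this trend is weak relative to disturbance-driven spatial patterns.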
Abstract:
Systematic differences in circadian rhythmicity are thought to be a substantial factor determining inter-individual differences in fatigue and cognitive performance. The synchronicity effect (when the time of testing coincides with the respective circadian peak period) seems to play an important role. Eye movements have been shown to be a reliable indicator of fatigue due to sleep deprivation or time spent on cognitive tasks. However, eye movements have not so far been used to investigate the circadian synchronicity effect and the resulting differences in fatigue. The aim of the present study was to assess how different oculomotor parameters in a free visual exploration task are influenced by: a) fatigue due to chronotypical factors (being a 'morning type' or an 'evening type'); b) fatigue due to the time spent on task. Eighteen healthy participants performed a free visual exploration task of naturalistic pictures while their eye movements were recorded. The task was performed twice, once at their optimal and once at their non-optimal time of the day. Moreover, participants rated their subjective fatigue. The non-optimal time of the day triggered a significant and stable increase in mean visual fixation duration during the free visual exploration task for both chronotypes. The increase in mean visual fixation duration correlated with the difference in subjectively perceived fatigue between optimal and non-optimal times of the day. Conversely, mean saccadic speed significantly and progressively decreased throughout the duration of the task but was not influenced by the optimal or non-optimal time of the day for either chronotype. The results suggest that different oculomotor parameters are discriminative for fatigue due to different sources. A decrease in saccadic speed seems to reflect fatigue due to time spent on task, whereas an increase in mean fixation duration reflects a lack of synchronicity between chronotype and time of the day.
Abstract:
A large body of empirical research shows that psychosocial risk factors (PSRFs) such as low socio-economic status, social isolation, stress, type-D personality, depression and anxiety increase the risk of incident coronary heart disease (CHD) and also contribute to poorer health-related quality of life (HRQoL) and prognosis in patients with established CHD. PSRFs may also act as barriers to lifestyle changes and treatment adherence and may moderate the effects of cardiac rehabilitation (CR). Furthermore, there appears to be a bidirectional interaction between PSRFs and the cardiovascular system. Stress, anxiety and depression affect the cardiovascular system through immune, neuroendocrine and behavioural pathways. In turn, CHD and its associated treatments may lead to distress in patients, including anxiety and depression. In clinical practice, PSRFs can be assessed with single-item screening questions, standardised questionnaires, or structured clinical interviews. Psychotherapy and medication can be considered to alleviate any PSRF-related symptoms and to enhance HRQoL, but the evidence for a definite beneficial effect on cardiac endpoints is inconclusive. A multimodal behavioural intervention, integrating counselling for PSRFs and coping with illness, should be included within comprehensive CR. Patients with clinically significant symptoms of distress should be referred for psychological counselling, psychologically focused interventions and/or psychopharmacological treatment. To conclude, the success of CR may critically depend on the interdependence of body and mind, and this interaction needs to be reflected in the assessment and management of PSRFs, in line with robust scientific evidence, by trained staff integrated within the core CR team.
Abstract:
Background: While the negative effects of spousal bereavement on well-being are well documented in empirical research, the large individual differences in psychological adaptation are still not well understood. Objective: This contribution aims to identify patterns of psychological adaptation to spousal loss in old age and to shed light on the role of intra- and interpersonal resources and contextual factors as discriminant variables among these patterns. Methods: The data stem from a cross-sectional questionnaire study of 402 widowed individuals (228 women, 174 men) aged between 60 and 89 years (mean age 74.41 years), who had lost their partner within the last 5 years, and 618 married individuals who served as controls (312 women, 306 men; mean age 73.82 years). Results: The exploratory latent profile analysis of the well-being outcomes of depressive symptoms, hopelessness, loneliness, life satisfaction and subjective health revealed three different groups in the widowed sample: 'resilients' (54% of the sample), 'copers' (39%) and 'vulnerables' (7%). The most important variables for group allocation were intrapersonal resources (psychological resilience and the Big Five personality traits), but also the quality of the former relationship and how the loss was experienced. Conclusion: Successful adaptation to spousal loss is primarily associated with high scores in psychological resilience and extraversion and low scores in neuroticism. Our results shed light on the variability in psychological adaptation and underline the important role of intrapersonal resources in facing spousal loss in old age.
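Latent profile analysis, used above to derive the 'resilients', 'copers' and 'vulnerables' groups, is a model-based clustering of continuous indicators. As a rough, illustrative stand-in (real LPA additionally estimates per-profile variances and mixing proportions, typically by maximum likelihood), a plain k-means pass on synthetic standardized well-being scores shows the underlying grouping idea; all data points and labels here are hypothetical:

```python
def kmeans(points, centers, iters=20):
    """Plain k-means: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        centers = [
            tuple(sum(dim) / len(g) for dim in zip(*g)) if g else c
            for g, c in zip(groups, centers)
        ]
    return centers, groups

# Synthetic standardized scores: (life satisfaction, depressive symptoms).
resilient = [(1.0, -1.0), (1.2, -0.8), (0.9, -1.1)]   # high LS, low depression
vulnerable = [(-1.0, 1.2), (-1.1, 0.9), (-0.9, 1.0)]  # low LS, high depression
centers, groups = kmeans(resilient + vulnerable, [(1.0, -1.0), (-1.0, 1.0)])
print(len(groups[0]), len(groups[1]))  # prints: 3 3
```

The two recovered centers sit at opposite corners of the score space, mirroring how LPA separates well-adapted from seriously affected respondents.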