911 results for "Neuropsychological deficits"
Abstract:
Approximately one-third of stroke patients experience depression. Stroke also has a profound effect on the lives of caregivers of stroke survivors, yet depression in this population has received little attention. The objectives of this study were to determine which factors are associated with, and can be used to predict, depression at different points in time after stroke; to compare different depression assessment methods among stroke patients; and to determine the prevalence, course and associated factors of depression among the caregivers of stroke patients. A total of 100 consecutive hospital-admitted patients no older than 70 years of age were followed for 18 months after their first ischaemic stroke. Depression was assessed according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-III-R), the Beck Depression Inventory (BDI), the Hamilton Rating Scale for Depression (HRSD), the Visual Analogue Mood Scale (VAMS), the Clinical Global Impression (CGI) and caregiver ratings. Neurological assessments and a comprehensive neuropsychological test battery were also performed. Depression in caregivers was assessed with the BDI. Depressive symptoms had an early onset in most cases. Mild depressive symptoms were often persistent, changing little during the 18-month follow-up, although major depression increased over the same interval. Stroke severity was associated with depression, especially from 6 to 12 months post-stroke. In the acute phase, older patients were at higher risk of depression, and a higher proportion of men were depressed at 18 months post-stroke. None of the depression assessment methods stood clearly apart from the others. Their feasibility did not differ greatly, but prevalence rates differed widely according to the criteria used. When compared against DSM-III-R criteria, sensitivity and specificity were acceptable for the CGI, BDI and HRSD; the CGI and BDI had better sensitivity than the more specific HRSD. The VAMS did not appear to be a reliable method for assessing depression among stroke patients. Caregivers often rated patients' depression as more severe than the patients themselves did, and their ratings appeared to be influenced by their own depression. Of the caregivers, 30-33% were depressed. In the acute phase, caregiver depression was associated with the severity of the stroke and the older age of the patient. The best predictor of caregiver depression at later follow-up was caregiver depression in the acute phase. The results suggest that depression should be assessed during the early post-stroke period and that follow-up of those at risk of poor emotional outcome should extend beyond the first year post-stroke. Furthermore, assessment of the well-being of caregivers should be included as part of the rehabilitation plan for stroke patients.
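To make the screening comparison concrete, here is a minimal Python sketch of how sensitivity and specificity are computed for a rating scale against a DSM-based reference diagnosis. The cutoff and the data are hypothetical illustrations, not values from the study.

    # Hypothetical sketch: sensitivity/specificity of a depression scale
    # (e.g., BDI scores) against a DSM-III-R reference diagnosis.
    # The cutoff and the data below are invented for illustration.

    def screen_performance(scores, reference, cutoff):
        """Classify scores >= cutoff as depressed; compare with reference."""
        tp = sum(s >= cutoff and r for s, r in zip(scores, reference))
        fn = sum(s < cutoff and r for s, r in zip(scores, reference))
        tn = sum(s < cutoff and not r for s, r in zip(scores, reference))
        fp = sum(s >= cutoff and not r for s, r in zip(scores, reference))
        sensitivity = tp / (tp + fn)  # proportion of true cases detected
        specificity = tn / (tn + fp)  # proportion of non-cases correctly excluded
        return sensitivity, specificity

    bdi = [4, 12, 19, 7, 23, 15, 3, 10]  # hypothetical BDI scores
    dsm = [False, True, True, False, True, True, False, False]  # reference diagnosis
    print(screen_performance(bdi, dsm, cutoff=10))  # -> (1.0, 0.75)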
Abstract:
Primary brain tumors are associated with significant physical, cognitive and psychosocial changes. Although treatment guidelines recommend offering multidisciplinary rehabilitation and support services to address patients’ residual deficits, the extent to which patients access such services is unclear. This study aimed to assess patients’ supportive care needs early after diagnosis, and quantify service awareness, referral and utilization. A population-based sample of 40 adults recently diagnosed with primary brain tumors was recruited through the Queensland Cancer Registry, representing 18.9% of the eligible population of 203 patients. Patients or carer proxies completed surveys of supportive care needs at baseline (approximately three months after diagnosis) and three months later. Descriptive statistics summarized needs and service utilization, and linear regression identified predictors of service use. Unmet supportive care needs were highest at baseline for all domains, and highest for the physical and psychological needs domains at each time point. At follow-up, participants reported awareness of, referral to, and use of 32 informational, support, health professional or practical services. All or almost all participants were aware of at least one informational (100%), health professional (100%), support (97%) or practical service (94%). Participants were most commonly aware of speech therapists (97%), physiotherapists (94%) and diagnostic information from the internet (88%). Clinician referrals were most commonly made to physiotherapists (53%), speech therapists (50%) and diagnostic information booklets (44%), and accordingly, participants most commonly used physiotherapists (56%), diagnostic information booklets (47%), diagnostic information from the internet (47%), and speech therapists (43%). Comparatively low referral to and use of psychosocial services may limit patients’ abilities to cope with their condition and the changes they experience.
Abstract:
The early detection of hearing deficits is important for a child's development. However, examining small children with behavioural methods is often difficult. Research with event-related potentials (ERPs), recorded with electroencephalography (EEG), requires neither attention nor action from the child. In children's ERP research especially, it is essential that recording sessions are not too long. A new, faster "optimum" paradigm has been developed for recording the mismatch negativity (MMN), in which ERPs to several sound features can be recorded in a single session, substantially shortening the time required for an experiment. So far, the paradigm has been used in research with adults and school-aged children. This study examined whether the MMN, the late discriminative negativity (LDN) and the P3a component can be recorded in two-year-olds with the new paradigm. The standard stimulus (p=0.50) was an 80 dB harmonic tone composed of three harmonic frequencies (500 Hz, 1000 Hz and 1500 Hz) with a duration of 200 ms. The loudness deviants (p=0.067) were presented at +6 dB or -6 dB relative to the standards. The frequency deviants (p=0.112) had a fundamental frequency of 550 or 454.4 Hz (small deviation), 625 or 400 Hz (medium deviation) or 750 or 333.3 Hz (large deviation). The duration deviants (p=0.112) had a duration of 175 ms (small deviation), 150 ms (medium deviation) or 100 ms (large deviation). The direction deviants (p=0.067) were presented from the left or the right loudspeaker only. The gap deviant (p=0.067) included a 5-ms silent gap in the middle of the sound. Altogether 17 children participated in the experiment, and data from 12 of them were used in the analysis. ERP components were observed for all deviant types. The MMN was significant for the duration and gap deviants. The LDN was significant for the large duration deviant and for all other deviant types. No significant P3a was observed. These results indicate that the optimum paradigm can be used with two-year-olds, and that data on several sound features can be collected in a shorter time than with the paradigms previously used in ERP research.
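To illustrate the stimulus structure, the following sketch draws trial types using the per-category probabilities stated above. It is a simplification under an assumption of independent random sampling; the actual optimum paradigm arranges standards and deviants in a fixed alternating structure, and this is not the original experiment code.

    import random

    # Minimal sketch of a multi-feature ("optimum") stimulus sequence,
    # using the per-category probabilities given in the abstract.
    # Plain random sampling is an illustrative simplification.
    TRIAL_TYPES = {
        "standard": 0.50,    # 80 dB harmonic tone, 200 ms
        "loudness": 0.067,   # +6 dB or -6 dB relative to the standard
        "frequency": 0.112,  # three deviation magnitudes
        "duration": 0.112,   # 175, 150 or 100 ms
        "direction": 0.067,  # left or right loudspeaker only
        "gap": 0.067,        # 5-ms silent gap in the middle of the sound
    }

    def make_sequence(n_trials, seed=0):
        rng = random.Random(seed)
        types, weights = zip(*TRIAL_TYPES.items())
        return rng.choices(types, weights=weights, k=n_trials)

    print(make_sequence(8))  # e.g., ['standard', 'duration', 'standard', ...]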
Abstract:
Intact working memory (WM) is essential for children and adults to cope with everyday life. Children with deficits in WM mechanisms have learning difficulties that are often accompanied by behavioral problems. The neural processes subserving WM, and the brain structures underlying this system, continue to develop from childhood through adolescence into young adulthood. Functional magnetic resonance imaging (fMRI) makes it possible to investigate the organization and development of WM. The present thesis investigated, using behavioral and neuroimaging methods, whether mnemonic processing of spatial and nonspatial visual information is segregated in the developing and mature human brain. A further aim was to investigate the organization and development of audiospatial and visuospatial information processing in WM. The behavioral results showed that spatial and nonspatial visual WM processing is segregated in the adult brain. The fMRI results in children suggested that memory-load-related processing of spatial and nonspatial visual information engages common cortical networks, whereas selective attention to either type of stimulus recruits partially segregated areas in the frontal, parietal and occipital cortices. Deactivation mechanisms that are important for the performance of WM tasks in adults are already operational in healthy school-aged children. Electrophysiological evidence suggested segregated mnemonic processing of visual and auditory location information. The results on the development of audiospatial and visuospatial WM demonstrate that WM performance improves with age, suggesting functional maturation of the underlying cognitive processes and brain areas. The development of spatial WM task performance follows a different time course in boys and girls, indicating a larger degree of immaturity in male than in female WM systems. Furthermore, the differences in mastering auditory and visual WM tasks may indicate that visual WM reaches functional maturity earlier than the corresponding auditory system. Spatial WM deficits may underlie some learning difficulties and behavioral problems related to impulsivity, difficulties in concentration, and hyperactivity. Alternatively, anxiety or depressive symptoms may affect WM function and the ability to concentrate, and thus be the primary cause of poor academic achievement in children.
Abstract:
Sleep deprivation leads to increased subsequent sleep length and depth and to deficits in cognitive performance in humans; in animals, extreme sleep deprivation is eventually fatal. The cellular and molecular mechanisms behind the symptoms of sleep deprivation are unclear. This thesis was inspired by the hypothesis that brain energy stores are depleted during wakefulness and replenished during sleep. Its aim was to elucidate the energy-metabolic processes taking place in the brain during sleep deprivation. Endogenous brain energy metabolite levels were assessed in vivo in rats and in humans in four separate studies (Studies I-IV). In the first part (Study I), the effects of local energy depletion on brain energy metabolism and sleep were studied in rats using in vivo microdialysis combined with high-performance liquid chromatography. Energy depletion induced by 2,4-dinitrophenol infusion into the basal forebrain was comparable to sleep deprivation in its effects: both increased the extracellular concentrations of adenosine, lactate and pyruvate, and increased subsequent sleep. This result supports the hypothesis of a connection between brain energy metabolism and sleep. The second part involved healthy human subjects (Studies II-IV). Study II assessed the feasibility of proton magnetic resonance spectroscopy (1H MRS) for studying brain lactate levels during cognitive stimulation. Cognitive stimulation induced an increase in lactate levels in the left inferior frontal gyrus, showing that metabolic imaging of neuronal activity related to cognition is possible with 1H MRS. Study III examined the effects of sleep deprivation and aging on the brain lactate response to cognitive stimulation. No physiological, cognitive-stimulation-induced lactate response appeared in the sleep-deprived or the aging subjects, which can be interpreted as a sign of malfunctioning brain energy metabolism. This malfunction may contribute to the functional impairment of the frontal cortex both during aging and during sleep deprivation. Finally (Study IV), 1H MRS major metabolite levels in the occipital cortex were assessed during sleep deprivation and during photic stimulation. N-acetyl-aspartate (NAA/H2O) decreased during sleep deprivation, supporting the hypothesis of a sleep deprivation-induced disturbance in brain energy metabolism. Choline-containing compounds (Cho/H2O) decreased during sleep deprivation and recovered to alert levels during photic stimulation, pointing towards changes in membrane metabolism and supporting earlier observations of an altered brain response to stimulation during sleep deprivation. Based on these findings, it can be concluded that sleep deprivation alters brain energy metabolism, although its effects may vary from one brain area to another. Although an effect of sleep deprivation might not always be detectable in the non-stimulated baseline state, a challenge imposed by cognitive or photic stimulation can reveal significant changes. It can be hypothesized that brain energy metabolism is more vulnerable during sleep deprivation than in the alert state. Changes in brain energy metabolism may participate in the homeostatic regulation of sleep and contribute to the deficits in cognitive performance during sleep deprivation.
Abstract:
Drought during the pre-flowering stage can increase the yield of peanut. There is limited information on genotypic variation in tolerance to, and recovery from, pre-flowering drought (PFD) and, more importantly, on the physiological traits underlying this variation. The objectives of this study were to determine the effects of moisture stress during the pre-flowering phase on pod yield and to understand some of the physiological responses underlying genotypic variation in response to, and recovery from, PFD. Glasshouse and field experiments were conducted at Khon Kaen University, Thailand. The glasshouse experiment was a randomized complete block design comprising two watering regimes (a fully irrigated control, and 1/3 available soil water from emergence to 40 days after emergence followed by an adequate water supply) and 12 peanut genotypes. The field experiment was a split-plot design with the two watering regimes as main plots and the 12 peanut genotypes as sub-plots. N2 fixation and leaf area (LA) were measured in both experiments; in addition, root growth was measured in the glasshouse experiment. Imposition of PFD followed by recovery increased yield by an average of 24% (range 10-57%) in the field experiment and 12% (range 2-51%) in the glasshouse experiment. Significant genotypic variation in N2 fixation, LA and root growth was also observed after recovery. The study revealed that recovery growth following release of PFD had a stronger influence on final yield than tolerance to water deficits during the PFD. A combination of N2 fixation, LA and root growth accounted for a major portion of the genotypic variation in yield (r = 0.68-0.93), suggesting that these traits could be used as selection criteria for identifying genotypes with rapid recovery from PFD. A combined analysis of the glasshouse and field experiments showed that LA and N2 fixation during recovery had low genotype x environment interaction, indicating the potential of these traits for selecting genotypes in peanut improvement programs.
Abstract:
Schizophrenia, affecting about 1% of the population worldwide, is a severe mental disorder characterized by positive and negative symptoms, such as psychosis and anhedonia, as well as cognitive deficits. At present, schizophrenia is considered a complex disorder of neurodevelopmental origin, with both genetic and environmental factors contributing to its onset. Although a number of candidate genes for schizophrenia have been highlighted, only very few schizophrenia patients are likely to share identical genetic liability. This study is based on the nationwide schizophrenia family sample of the National Institute for Health and Welfare, one of the largest and best-characterized familial series in the world. In the first part of this study, we investigated the roles of the DTNBP1, NRG1 and AKT1 genes in the background of schizophrenia in Finland. Although these genes have been associated with schizophrenia liability in several populations, no significant association with the clinical diagnosis of schizophrenia was found in our sample of 441 schizophrenia families. In the second part, we first replicated schizophrenia linkage on the long arm of chromosome 7 in 352 schizophrenia families. In the subsequent association analysis, we utilized clinical disorder features and intermediate phenotypes (endophenotypes) in addition to diagnostic information from altogether 290 neuropsychologically assessed schizophrenia families. An intragenic short tandem repeat allele of the regional RELN gene, thought to play a role in the background of several neurodevelopmental disorders, showed significant association with poorer cognitive functioning and more severe schizophrenia symptoms. This risk allele was also significantly more prevalent among individuals affected with schizophrenia spectrum disorders. We have previously identified linkage of schizophrenia and its cognitive endophenotypes on the long arms of chromosomes 2, 4 and 5. In the last part of this study, we selected altogether 104 functionally relevant candidate genes from the linked regions. We detected several promising associations, of which the most interesting are the ERBB4 gene, associated with the severity of schizophrenia symptoms and with impairments in traits related to verbal abilities, and the GRIA1 gene, associated with the severity of schizophrenia symptoms. Our results extend the previous evidence that the genetic risk for schizophrenia is at least partially mediated via the effects of candidate genes, and their combinations, on relevant brain systems, resulting in alterations in different disorder domains, such as cognitive deficits.
Abstract:
Schizophrenia is a severe psychotic disorder affecting 0.5-1% of the population. It is characterized by hallucinations, delusions, disorganized behavior and speech, avolition, anhedonia, flattened affect and cognitive deficits. The etiology of the disorder is complex, with evidence for multiple genes contributing to its onset along with environmental factors. DISC1 is one of the most promising candidate genes for schizophrenia. It codes for a protein that takes part in numerous molecular interactions along several pathways. This network, termed the DISC1 pathway, is evidently important for the development and maturation of the central nervous system from the embryonic period until young adulthood, and its disruption is thought to predispose to schizophrenia. In the present study, we examined the DISC1 pathway in the etiology of schizophrenia in the Finnish population, utilizing two large Finnish samples: the schizophrenia family sample in which DISC1 was originally shown to associate with schizophrenia, and the Northern Finland Birth Cohort 1966 (NFBC66). Several DISC1 binding partners, along with DISC1 itself, displayed evidence for association in the family sample. Through a genome-wide linkage study, we found, in carriers of a certain DISC1 risk variant, a significant linkage signal at the locus harboring the DISC1 binding partner NDE1. In a follow-up study, genetic markers in NDE1 displayed significant evidence for association with schizophrenia. Further exploration of the association between 11 genes of the DISC1 pathway and schizophrenia led to the recognition of novel variants in NDEL1, PDE4B and PDE4D that significantly increased or decreased the risk for schizophrenia. We also found evidence that DISC1 itself has a significant role in human mental functioning even in the healthy population. Variants in DISC1 had a significant effect on anhedonia, a trait present in everybody but, in its severe form, one of the main symptoms of schizophrenia and correlated with the risk of developing the disorder. Furthermore, utilizing genome-wide marker data, we recognized three genes (MIR620, CCDC141 and LCT) that are closely related to the DISC1 pathway but whose effects on anhedonia were observable only in individuals who carried these specific DISC1 variants. Our findings add substantially to the previous evidence for the involvement of DISC1 and the DISC1 pathway in the etiology of schizophrenia and psychosis. They support the concept that a number of DISC1 pathway-related genes contribute, along with DISC1, to the etiology of schizophrenia, provide new candidates for schizophrenia studies, and strengthen the evidence that DISC1 itself plays a role in psychological functioning in the general population.
Abstract:
The accumulation of deficits with increasing age results in a decline in the functional capacity of multiple organs and systems. These changes can have a significant influence on the pharmacokinetics and pharmacodynamics of prescribed drugs. Although alterations in body composition and worsening renal clearance are important considerations, for most drugs the liver has the greatest effect on metabolism. Age-related change in hepatic function thereby causes much of the variability in older people’s responses to medication. In this review, we propose that a decline in the ability of the liver to inactivate toxins may contribute to a proinflammatory state in which frailty can develop. Since inflammation also downregulates drug metabolism, medication prescribed to frail older people in accordance with disease-specific guidelines may undergo reduced systemic clearance, leading to adverse drug reactions, further functional decline and increasing polypharmacy, exacerbating rather than ameliorating frailty status. We also describe how increasing chronological age and frailty status affect liver size, blood flow, protein binding and the enzymes of drug metabolism, and use this to contextualise our discussion of appropriate prescribing practices. For example, while the general axiom of ‘start low, go slow’ should underpin the initiation of medication (titrating to a defined therapeutic goal), it is important to consider whether drug clearance is flow- or capacity-limited. By summarising the effect of age-related changes in hepatic function on medications commonly used in older people, we aim to provide a guide with high clinical utility for practising geriatricians.
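The flow- versus capacity-limited distinction can be made concrete with the standard well-stirred model of hepatic clearance; the equation below is textbook pharmacokinetics, not a formula given in the review itself.

    % Well-stirred model of hepatic clearance (standard pharmacokinetics):
    %   Q_H    = hepatic blood flow
    %   f_u    = fraction of drug unbound in plasma
    %   CL_int = intrinsic (enzymatic) clearance
    \[
      CL_H = \frac{Q_H \, f_u \, CL_{\mathrm{int}}}{Q_H + f_u \, CL_{\mathrm{int}}}
    \]
    % If f_u CL_int >> Q_H, then CL_H ~ Q_H: clearance is flow-limited and
    % tracks the age-related fall in hepatic blood flow.
    % If f_u CL_int << Q_H, then CL_H ~ f_u CL_int: clearance is
    % capacity-limited and tracks reduced enzyme activity and altered
    % protein binding.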
Abstract:
Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds, and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and a Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program for forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time the ground was under plant cover. When grown as a sole crop, well-fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha per crop achieved a significantly higher yield and forage N concentration than the industry-standard forage sorghum (conventional tillage and 55 kg N/ha per crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in response between the two sites was attributed to soil type and fertiliser history. Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum grown as a sole crop was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, the alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed the most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will benefit both farm productivity and the soil-resource base.
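The fertiliser response quoted for the Vertosol can be verified directly from the reported figures; a minimal sketch, assuming the "above the base rate" amount is the 120 kg N/ha-per-crop difference between the improved and industry-standard rates:

    # Check the reported N response on the Vertosol using figures from the
    # abstract. Treating 55 kg N/ha per crop as the base rate is our reading.
    yield_gain_kg_dm = (9.64 - 5.65) * 1000  # t DM/ha -> kg DM/ha
    extra_n_kg = 175 - 55                    # kg N/ha per crop above base
    print(round(yield_gain_kg_dm / extra_n_kg, 1))  # -> 33.2 kg DM per kg N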
Abstract:
PURPOSE: In vivo corneal confocal microscopy (CCM) is increasingly used as a surrogate endpoint in studies of diabetic polyneuropathy (DPN). However, it is not clear whether imaging the central cornea provides optimal diagnostic utility for DPN. Therefore, we compared nerve morphology in the central cornea and the inferior whorl, a more distal and densely innervated area located inferior and nasal to the central cornea. METHODS: A total of 53 subjects with type 1/type 2 diabetes and 15 age-matched control subjects underwent detailed assessment of neuropathic symptoms (NPS), deficits (neuropathy disability score [NDS]), quantitative sensory testing (vibration perception threshold [VPT], cold and warm threshold [CT/WT], and cold- and heat-induced pain [CIP/HIP]), and electrophysiology (sural and peroneal nerve conduction velocity [SSNCV/PMNCV], and sural and peroneal nerve amplitude [SSNA/PMNA]) to diagnose patients with (DPN+) and without (DPN-) neuropathy. Corneal nerve fiber density (CNFD) and length (CNFL) in the central cornea, and inferior whorl length (IWL) were quantified. RESULTS: Comparing control subjects to DPN- and DPN+ patients, there was a significant increase in NDS (0 vs. 2.6 ± 2.3 vs. 3.3 ± 2.7, P < 0.01), VPT (V; 5.4 ± 3.0 vs. 10.6 ± 10.3 vs. 17.7 ± 11.8, P < 0.01), WT (°C; 37.7 ± 3.5 vs. 39.1 ± 5.1 vs. 41.7 ± 4.7, P < 0.05), and a significant decrease in SSNCV (m/s; 50.2 ± 5.4 vs. 48.4 ± 5.0 vs. 39.5 ± 10.6, P < 0.05), CNFD (fibers/mm2; 37.8 ± 4.9 vs. 29.7 ± 7.7 vs. 27.1 ± 9.9, P < 0.01), CNFL (mm/mm2; 27.5 ± 3.6 vs. 24.4 ± 7.8 vs. 20.7 ± 7.1, P < 0.01), and IWL (mm/mm2; 35.1 ± 6.5 vs. 26.2 ± 10.5 vs. 23.6 ± 11.4, P < 0.05). For the diagnosis of DPN, CNFD, CNFL, and IWL achieved an area under the curve (AUC) of 0.75, 0.74, and 0.70, respectively, and a combination of IWL-CNFD achieved an AUC of 0.76. CONCLUSIONS: The parameters of CNFD, CNFL, and IWL have a comparable ability to diagnose patients with DPN. However, IWL detects an abnormality even in patients without DPN. Combining IWL with CNFD may improve the diagnostic performance of CCM.
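As a hedged illustration of how a reported AUC can be computed, the sketch below uses the Mann-Whitney (rank-sum) formulation of ROC AUC for a marker in which lower values indicate disease, as with CNFD. All values are invented; this is not the study's analysis code.

    # Hypothetical sketch: ROC AUC via the Mann-Whitney interpretation,
    # for a marker where LOWER values indicate disease (e.g., CNFD).
    # The values below are invented for illustration.

    def auc_lower_indicates_disease(cases, controls):
        """P(a random case has a lower marker value than a random control)."""
        pairs = [(c, d) for d in cases for c in controls]
        wins = sum(1.0 if d < c else 0.5 if d == c else 0.0 for c, d in pairs)
        return wins / len(pairs)

    cnfd_dpn = [18.5, 22.0, 27.1, 30.3, 25.4]   # hypothetical DPN+ values
    cnfd_ctrl = [37.8, 33.2, 40.1, 35.6, 29.0]  # hypothetical control values
    print(round(auc_lower_indicates_disease(cnfd_dpn, cnfd_ctrl), 2))  # -> 0.96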
Abstract:
Background: Data describing the Australian allied health workforce is inadequate and therefore insufficient for workforce planning. National health policy reform requires that health-care models take into account future workforce requirements, the distribution and work contexts of existing practitioners, training needs, workforce roles and scope of practice. Good information on this workforce is essential for managing services as demands increase, for the accountability of practitioners, for the measurement of outcomes and for benchmarking against other jurisdictions. A comprehensive data set is essential to underpin policy and planning to meet future health workforce needs. Discussion: Some data on the allied health professions is managed by the Australian Health Practitioner Regulation Agency; however, there is limited information regarding several core allied health professions. A global registration and accreditation scheme recognizing all allied health professions might provide safeguards and credibility for professionals and their clients. Summary: Arguments are presented about inconsistencies and voids in the available information about allied health services. Remedying these information deficits is essential to underpin policy and planning for future health workforce needs. We make the case for a comprehensive national data set based on a broad and inclusive sampling process across the allied health population.
Abstract:
Impulsivity and hyperactivity share common ground with numerous mental disorders, including schizophrenia. Recently, a population-specific serotonin 2B (5-HT2B) receptor stop codon (i.e., HTR2B Q20*) was reported to segregate with severely impulsive individuals, and 5-HT2B mutant (Htr2B−/−) mice also showed high impulsivity. Interestingly, in the same cohort, early-onset schizophrenia was more prevalent in HTR2B Q20* carriers. However, the putative role of the 5-HT2B receptor in the neurobiology of schizophrenia has never been investigated. We assessed the effects of genetic and pharmacological ablation of 5-HT2B receptors in mice subjected to a comprehensive series of behavioral screening tests for schizophrenic-like symptoms and investigated relevant dopaminergic and glutamatergic neurochemical alterations in the cortex and the striatum. Domains related to the positive, negative and cognitive symptom clusters of schizophrenia were affected in Htr2B−/− mice, as shown by deficits in sensorimotor gating, selective attention, social interactions, and learning and memory processes. In addition, Htr2B−/− mice presented an enhanced locomotor response to the psychostimulants dizocilpine and amphetamine, and robust alterations in sleep architecture. Moreover, ablation of 5-HT2B receptors induced a region-selective decrease of dopamine and glutamate concentrations in the dorsal striatum. Importantly, selected schizophrenic-like phenotypes and endophenotypes were rescued by chronic haloperidol treatment. We report herein that 5-HT2B receptor deficiency confers a wide spectrum of antipsychotic-sensitive, schizophrenic-like behavioral and psychopharmacological phenotypes in mice, and we provide the first evidence for a role of 5-HT2B receptors in the neurobiology of psychotic disorders.
Abstract:
This study aimed to determine: 1) the spatial patterns of hamstring activation during the Nordic hamstring exercise (NHE); 2) whether previously injured hamstrings display activation deficits during the NHE; and 3) whether previously injured hamstrings exhibit altered cross-sectional area. Ten healthy, recreationally active males with a history of unilateral hamstring strain injury underwent functional magnetic resonance imaging (fMRI) of their thighs before and after 6 sets of 10 repetitions of the NHE. Transverse (T2) relaxation times of all hamstring muscles (biceps femoris long head [BFlh], biceps femoris short head [BFsh], semitendinosus [ST], semimembranosus [SM]) were measured at rest and immediately after the NHE, and cross-sectional area (CSA) was measured at rest. For the uninjured limb, the ST’s percentage increase in T2 with exercise was 16.8, 15.8 and 20.2% greater than the increases exhibited by the BFlh, BFsh and SM, respectively (p<0.002 for all). Previously injured hamstring muscles (n=10) displayed significantly smaller increases in T2 post-exercise than the homonymous muscles in the uninjured contralateral limb (mean difference -7.2%, p=0.001). No muscle displayed significant between-limb differences in CSA. During the NHE, the ST is preferentially activated, and previously injured hamstring muscles display chronic activation deficits compared with uninjured contralateral muscles.
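For clarity, the exercise-induced T2 shift reported above is conventionally expressed as a percentage increase over the resting value; a minimal sketch with hypothetical relaxation times:

    # Percentage increase in T2 with exercise, computed from resting and
    # post-exercise values. The times below are hypothetical examples.
    def t2_percent_increase(t2_rest_ms, t2_post_ms):
        return 100.0 * (t2_post_ms - t2_rest_ms) / t2_rest_ms

    print(t2_percent_increase(40.0, 50.0))  # -> 25.0 (% increase)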
Abstract:
Stereotypes about different groups persist in organizations. Employees from such groups may experience stereotype threat, or the concern that they are being judged on the basis of demeaning stereotypes about groups to which they belong. The goal of this focal article is to discuss whether stereotype threat is a useful construct for organizational psychology research and practice. To this end, we focus on consequences other than acute performance deficits in laboratory settings. In particular, we examine studies that highlight the effects of stereotype threat on intrapersonal outcomes (e.g., job attitudes), interpersonal outcomes (e.g., negotiation), and on the relationship between employees and their organization. The research reviewed suggests that stereotype threat is a potentially important phenomenon in organizations, but it also highlights the paucity of research in an organizational context. We provide suggestions for future research directions as well as for the prevention and amelioration of stereotype threat in the workplace.