911 results for Neuropsychological deficits


Relevance:

10.00%

Publisher:

Abstract:

On-axis monochromatic higher-order aberrations increase with age. Few studies have examined peripheral refraction along the horizontal meridian of older eyes, and none have examined their off-axis higher-order aberrations. We measured wave aberrations over the central 42° × 32° visual field for a 5 mm pupil in 10 young and 7 older emmetropes. Patterns of peripheral refraction were similar in the two groups. Coma increased linearly with field angle at a significantly higher rate in older than in young emmetropes (−0.018±0.007 versus −0.006±0.002 µm/deg). Spherical aberration was almost constant over the measured field in both age groups, and mean values across the field were significantly higher in older than in young emmetropes (+0.08±0.05 versus +0.02±0.04 µm). Total root-mean-square and higher-order aberrations increased more rapidly with field angle in the older emmetropes. However, the limits to monochromatic peripheral retinal image quality are largely determined by the second-order aberrations, which do not change markedly with age, and under normal conditions the relative importance of the increased higher-order aberrations in older eyes is lessened by the reduction in pupil diameter with age. It is therefore unlikely that the peripheral visual performance deficits observed in normal older individuals are primarily attributable to the increased impact of higher-order aberration.
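The linear field dependence of coma reported above can be illustrated with a small numerical sketch. The slopes come from the abstract; the zero intercept, the function names, and the RMS example coefficients are assumptions for illustration only.

```python
import math

def coma_at_field_angle(slope_um_per_deg, angle_deg):
    """Coma coefficient (um) at a given field angle, assuming a linear
    model through zero (intercept assumed, not reported in the abstract)."""
    return slope_um_per_deg * angle_deg

def total_rms(zernike_coeffs_um):
    """Total RMS wavefront error (um) from a list of Zernike coefficients:
    the square root of the sum of squared coefficients."""
    return math.sqrt(sum(c * c for c in zernike_coeffs_um))

# Using the reported slopes for older vs. young emmetropes at 20 deg:
older_coma = coma_at_field_angle(-0.018, 20.0)  # about -0.36 um
young_coma = coma_at_field_angle(-0.006, 20.0)  # about -0.12 um
```

Under this sketch, a 20° field angle yields roughly three times more coma in the older group, consistent with the threefold slope difference.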


This paper considers the pros and cons of using behavioural cloning for the development of low-level helicopter automation modules. Over the course of this project several behavioural cloning approaches were investigated. The results of the most effective approach are then compared with PID modules designed for the same aircraft. The comparison takes into consideration development time, reliability, and control performance. It was found that behavioural cloning techniques employing local approximators and wide state-space coverage during training can produce stabilising control modules in less time than tuning PID controllers. However, performance and reliability deficits were found to exist with behavioural cloning, attributable largely to the time-variant nature of the dynamics due to the operating environment, and to the pilot actions being poor for teaching. The final conclusion drawn here is that tuning PID modules remains superior to behavioural cloning for low-level helicopter automation.


Children with early and continuously treated phenylketonuria (ECT-PKU) remain at risk of developing executive function (EF) deficits. There is some evidence that a high phenylalanine to tyrosine ratio (phe:tyr) is more strongly associated with impaired EF development than high phenylalanine alone. This study examined EF in a sample of 11 adolescents against concurrent and historical levels of phenylalanine, phe:tyr, and tyrosine. Lifetime measures of phe:tyr were more strongly associated with EF than phenylalanine-only measures. Children with a lifetime phe:tyr less than 6 demonstrated normal EF, whereas children who had a lifetime phe:tyr above 6, on average, demonstrated clinically impaired EF.


This book disseminates current information pertaining to the modulatory effects of foods and other food substances on behavior and neurological pathways and, importantly, vice versa. This ranges from the neuroendocrine control of eating to the effects of life-threatening disease on eating behavior. The importance of this contribution to the scientific literature lies in the fact that food and eating are an essential component of cultural heritage, but the effects of perturbations in the food/cognitive axis can be profound. The complex interrelationship between neuropsychological processing, diet, and behavioral outcome is explored within the context of the most contemporary psychobiological research in the area. This comprehensive psychobiology- and pathology-themed text examines the broad spectrum of diet, behavioral, and neuropsychological interactions, from normative function to occurrences of severe and enduring psychopathological processes.


Longitudinal data, in which observations are repeatedly made or measured over time or age, provide the foundation for the analysis of processes that evolve over time; these analyses can be referred to as growth or trajectory models. One of the traditional ways of looking at growth models is to employ either linear or polynomial functional forms to model trajectory shape, and to account for variation around an overall mean trend with the inclusion of random effects, or individual variation, on the functional shape parameters. The identification of distinct subgroups or sub-classes (latent classes) within these trajectory models, which are not based on some pre-existing individual classification, provides an important methodology with substantive implications. The identification of subgroups or classes has wide application in the medical arena, where responder/non-responder identification based on distinctly differing trajectories delivers further information for clinical processes. This thesis develops Bayesian statistical models and techniques for the identification of subgroups in the analysis of longitudinal data where the number of time intervals is limited. These models are then applied to a single case study investigating neuropsychological cognition in early stage breast cancer patients undergoing adjuvant chemotherapy treatment, from the Cognition in Breast Cancer Study undertaken by the Wesley Research Institute of Brisbane, Queensland. Alternative formulations to the linear or polynomial approach are taken, using piecewise linear models with a single turning point, change-point or knot at a known time point, and latent basis models for the non-linear trajectories found for the verbal memory domain of cognitive function before and after chemotherapy treatment.
Hierarchical Bayesian random effects models are used as a starting point for the latent class modelling process and are extended with the incorporation of covariates in the trajectory profiles and as predictors of class membership. The Bayesian latent basis models enable the degree of recovery post-chemotherapy to be estimated for short- and long-term follow-up occasions, and the distinct class trajectories assist in the identification of breast cancer patients who may be at risk of long-term verbal memory impairment.
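The mean function of a piecewise linear trajectory with a single known change-point, as described above, can be sketched as follows. This is a minimal illustration of the functional form only: parameter names and example values are hypothetical, and the thesis fits these models hierarchically with random effects and latent classes rather than as a fixed curve.

```python
def piecewise_linear(t, intercept, pre_slope, slope_change, knot):
    """Trajectory value at time t for a piecewise linear model with one
    change-point (knot) at a known time: the slope changes by
    `slope_change` once t passes the knot."""
    return intercept + pre_slope * t + slope_change * max(t - knot, 0.0)

# Hypothetical example: a cognition score declines during treatment
# (t < knot) and partially recovers afterwards (t > knot).
during = piecewise_linear(0.5, 50.0, -4.0, 6.0, 1.0)  # pre-knot slope only
after = piecewise_linear(2.0, 50.0, -4.0, 6.0, 1.0)   # recovery slope applies
```

Latent class versions of this model would allow, for example, one class with `slope_change` large enough to recover and another class whose trajectory remains flat or declining after the knot.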


To investigate the meaning and understanding of domestic food preparation within the lived experience of the household's main food preparer, this ethnographic study used a combination of qualitative and quantitative methodologies. Data were collected from three sources: the literature; an in-store survey of 251 food shoppers chosen at random while shopping during both peak and off-peak periods at metropolitan supermarkets; and semi-structured interviews with the principal food shopper and food preparer of 15 different Brisbane households. Male and female respondents, representing a cross-section of socio-economic groupings, ranged in age from 19 to 79 years and were all from English-speaking backgrounds. Changes in paid labour force participation, income, and education have increased the value of the respondents' time, instigating massive changes in the way they shop, cook, and eat. Much of their food preparation has moved from the domestic kitchen into the kitchens of other food establishments. For both sexes, the dominant motivating force behind these changes is a combination of their self-perceived lack of culinary skill, lack of enjoyment of cooking, and lack of motivation to cook. The females in paid employment emphasise all factors, particularly the latter two, significantly more than the non-employed females. All factors are of increasing importance for individuals aged less than 35 years and, conversely, of significantly diminished importance to older respondents. Overall, it is the respondents aged less than 25 years who indicate the lowest cooking frequency and/or least cooking ability. Inherent in this latter group is an indifference to the art and practice of preparing food. Increasingly, all respondents want to do less cooking and/or get the cooking over with as quickly as possible. Convenience is a powerful lure by which to spend less time in the kitchen, and there is an apparent willingness to pay a premium for it.
Because children today are increasingly unlikely to be taught to cook, addressing the food skills deficit and encouraging individuals to cook for themselves are significant issues confronting health educators. These issues are suggested as appropriate subjects of future research.


In the current study, we tested whether school connectedness mediates the influence of more distal social skills deficits on depressive symptoms in a sample of 127 sixth- and seventh-grade students. Results demonstrated that school connectedness and social skills accounted for 44% and 26% of the variance in depressive symptoms, respectively, and 49% in a combined model. Although the full mediation hypothesis was not supported, follow-up analyses revealed that school connectedness partially mediated the link between social skills and preadolescent depressive symptoms. Thus, school connectedness appears to play as strong a role as social skills in depressive symptoms in this younger preadolescent age group.


Patients with idiopathic small fibre neuropathy (ISFN) have been shown to have significant intraepidermal nerve fibre loss and an increased prevalence of impaired glucose tolerance (IGT). It has been suggested that the dysglycemia of IGT and additional metabolic risk factors may contribute to small nerve fibre damage in these patients. Twenty-five patients with ISFN and 12 age-matched control subjects underwent a detailed evaluation of neuropathic symptoms and neurological deficits (neuropathy deficit score, NDS), nerve conduction studies (NCS), quantitative sensory testing (QST), and corneal confocal microscopy (CCM) to quantify small nerve fibre pathology. Eight (32%) patients had IGT. While all patients with ISFN had significant neuropathic symptoms, NDS, NCS, and QST results were normal except for warm thresholds. Corneal sensitivity was reduced, and CCM demonstrated a significant reduction in corneal nerve fibre density (NFD) (P < 0.0001), nerve branch density (NBD) (P < 0.0001), and nerve fibre length (NFL) (P < 0.0001), and an increase in nerve fibre tortuosity (NFT) (P < 0.0001). However, these parameters did not differ between ISFN patients with and without IGT, nor did they correlate with BMI, lipids, or blood pressure. Corneal confocal microscopy provides a sensitive, non-invasive means to detect small nerve fibre damage in patients with ISFN, and metabolic abnormalities do not relate to the nerve damage.


Although current assessments of the effects of agricultural management practices on soil organic C (SOC) dynamics are usually conducted without any explicit consideration of limits to soil C storage, it has been hypothesized that the SOC pool has an upper, or saturation, limit with respect to C input levels at steady state. Agricultural management practices that increase C input levels over time produce a new equilibrium soil C content; however, multiple C input level treatments that produce no increase in SOC stocks at equilibrium show that soils have become saturated with respect to C inputs. SOC storage of added C input is a function of how far a soil is from its saturation level (the saturation deficit) as well as of the C input level. We tested experimentally whether C saturation deficit and varying C input levels influenced soil C stabilization of added 13C in soils varying in SOC content and physicochemical characteristics. We incubated for 2.5 years soil samples from seven agricultural sites that were closer to (i.e., A-horizon) or further from (i.e., C-horizon) their C saturation limit. At the initiation of the incubations, samples received low or high C input levels of 13C-labeled wheat straw. We also tested the effect of Ca addition and residue quality on a subset of these soils. We hypothesized that the proportion of C stabilized would be greater in samples with larger C saturation deficits (i.e., the C- versus A-horizon samples) and that the relative stabilization efficiency (i.e., ΔSOC/ΔC input) would decrease as C input level increased. We found that C saturation deficit influenced the stabilization of added residue at six of the seven sites and that C addition level affected the stabilization of added residue at four sites, corroborating both hypotheses. Increasing Ca availability or decreasing residue quality had no effect on the stabilization of added residue.
The amount of new C stabilized was significantly related to C saturation deficit, supporting the hypothesis that C saturation influenced C stabilization at all our sites. Our results suggest that soils with low C contents and degraded lands may have the greatest potential and efficiency to store added C because they are further from their saturation level.
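The relative stabilization efficiency referred to above is a simple ratio, and the saturation argument can be sketched numerically. The function name and all numbers below are hypothetical illustrations, not data from the study.

```python
def relative_stabilization_efficiency(delta_soc, delta_c_input):
    """Fraction of added residue C retained as soil organic C
    (Delta SOC / Delta C input), e.g. both in g C per kg soil."""
    if delta_c_input <= 0:
        raise ValueError("C input must be positive")
    return delta_soc / delta_c_input

# Hypothetical illustration of the saturation hypothesis: for the same
# straw addition, a C-horizon soil far from its saturation limit retains
# a larger share of the added C than an A-horizon soil near saturation.
far_from_saturation = relative_stabilization_efficiency(0.6, 2.0)  # 0.30
near_saturation = relative_stabilization_efficiency(0.2, 2.0)      # 0.10
```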


Executive summary

Objective: The aims of this study were to identify the impact of Pandemic (H1N1) 2009 Influenza on Australian Emergency Departments (EDs) and their staff, and to inform planning, preparedness, and response management arrangements for future pandemics, as well as the management of infectious patients presenting to EDs in everyday practice.

Methods: This study involved three elements:
1. An examination of published material, including published statistics. Standard literature research methods were used to identify relevant published articles. In addition, data about ED demand were obtained from Australian Government Department of Health and Ageing (DoHA) publications, with several state health departments providing more detailed data.
2. A survey of Directors of Emergency Medicine, identified with the assistance of the Australasian College for Emergency Medicine (ACEM). This survey retrieved data about demand for ED services and elicited qualitative comments on the impact of the pandemic on ED management.
3. A survey of ED staff. A questionnaire was emailed to members of three professional colleges: the ACEM, the Australian College of Emergency Nursing (ACEN), and the College of Emergency Nursing Australasia (CENA). The overall response rate for the survey was 18.4%, with 618 usable responses from 3355 distributed questionnaires. Topics covered by the survey included ED conditions during the pandemic; information received about Pandemic (H1N1) 2009 Influenza; pandemic plans; the impact of the pandemic on ED staff with respect to stress; illness prevention measures; support received from others in the work role; staff and others' illness during the pandemic; other factors causing ED staff to miss work during the pandemic; and vaccination against Pandemic (H1N1) 2009 Influenza.
Both qualitative and quantitative data were collected and analysed.

Results: The results obtained from Directors of Emergency Medicine quantifying the impact of the pandemic were too limited for interpretation. Data sourced from health departments and published sources demonstrated an increase in influenza-like illness (ILI) presentations of between one and a half and three times the normal level. Directors of Emergency Medicine reported a reasonable level of preparation for the pandemic, with most reporting the use of pandemic plans that translated into relatively effective operational infection control responses. Directors reported a highly significant impact on EDs and their staff from the pandemic. Growth in demand and related ED congestion were highly significant causes of distress within the departments. Most (64%) respondents established a 'flu clinic', either as part of ED operations or external to them. They did not note a significantly higher rate of sick leave than usual. Responses relating to the impact on staff were proportional to the size of the colleges. Most respondents felt strongly that Pandemic (H1N1) 2009 Influenza had a significant impact on demand in their ED, with most patients having low levels of clinical urgency. Most respondents felt that the pandemic had a negative impact on the care of other patients, and 94% reported some increase in stress due to lack of space for patients, increased demand, and the need to fill staff deficits. Levels of concern about respondents themselves or their family members contracting the illness were lower than expected. Nurses displayed significantly higher levels of stress overall, particularly in relation to skill-mix requirements, lack of supplies and equipment, and aggression from patients and patients' families. More than one-third of respondents became ill with an ILI. While respondents themselves reported taking low levels of sick leave, they cited difficulties with replacing absent staff. Ranked from highest to lowest, respondents gained useful support from ED colleagues, ED administration, their hospital occupational health department, hospital administration, professional colleges, the state health department, and their unions. Respondents were generally positive about the information they received overall; however, the volume of information was considered excessive and sometimes inconsistent. The media were criticised as scaremongering and sensationalist, and as the cause of many unnecessary presentations to EDs. Of concern to the investigators was that a large proportion (43%) of respondents did not know whether a pandemic plan existed for their department or hospital. A small number of staff reported being redeployed from their usual workplace for personal risk factors or operational reasons. At the time of the survey (29 October to 18 December 2009), 26% of ED staff reported being vaccinated against Pandemic (H1N1) 2009 Influenza. Of those not vaccinated, half indicated they would 'definitely' or 'probably' not get vaccinated, the main reasons being that the vaccine was 'rushed into production', 'not properly tested', or 'came out too late', or that it was not needed due to prior infection or exposure, or due to the mildness of the disease.

Conclusion: Pandemic (H1N1) 2009 Influenza had a significant impact on Australian Emergency Departments. The pandemic exposed problems in existing plans, particularly a lack of guidelines, general information overload, and confusion due to the lack of a single authoritative information source. Of concern was the high proportion of respondents who did not know whether their hospital or department had a pandemic plan. Nationally, the pandemic communication strategy needs a detailed review, with more engagement with media networks to encourage responsible and consistent reporting. Also of concern were the low level of immunisation and the low level of intention to accept vaccination, a problem seen in many previous studies of seasonal influenza and health care workers. The design of EDs needs to be addressed to better manage infectious patients. Significant workforce issues were confronted in this pandemic, including maintaining appropriate staffing levels; staff exposure to illness; access to, and appropriate use of, personal protective equipment (PPE); and the difficulties associated with working in PPE for prolonged periods. An administrative issue of note was the reporting requirement, which created considerable additional stress for staff within EDs. Peer and local support strategies helped ensure staff felt their needs were provided for, creating resilience, dependability, and stability in the ED workforce. Policies regarding the establishment of flu clinics need to be reviewed. The ability to create surge capacity within EDs, by considering staffing, equipment, physical space, and stores, is of primary importance for future pandemics.


Purpose: Flickering stimuli increase the metabolic demand of the retina, making them sensitive perimetric stimuli for detecting the early onset of retinal disease. We determined whether flickering stimuli are a sensitive indicator of vision deficits resulting from acute, mild systemic hypoxia when compared with standard static perimetry. Methods: Static and flicker visual perimetry were performed in 14 healthy young participants while breathing 12% oxygen (hypoxia) under photopic illumination. The hypoxia visual field data were compared with the field data measured during normoxia. Absolute sensitivities (in dB) were analysed in seven concentric rings at 1°, 3°, 6°, 10°, 15°, 22° and 30° eccentricities, and the mean defect (MD) and pattern defect (PD) were calculated. Preliminary data are reported for mesopic light levels. Results: Under photopic illumination, flicker and static visual field sensitivities at all eccentricities were not significantly different between the hypoxia and normoxia conditions. The mean defect and pattern defect were not significantly different for either test between the two oxygenation conditions. Conclusion: Although flicker stimulation increases cellular metabolism, photopic flicker visual field impairment is not detected during mild hypoxia. These findings contrast with electrophysiological flicker tests in young participants that show impairment under photopic illumination during the same levels of mild hypoxia. Potential mechanisms contributing to the difference between the visual field and electrophysiological flicker tests, including variability in perimetric data, neuronal adaptation, and vascular autoregulation, are considered. The data have implications for the use of visual perimetry in the detection of ischaemic/hypoxic retinal disorders under photopic and mesopic light levels.


To evaluate whether luminance contrast discrimination losses in amblyopia on putative magnocellular (MC) and parvocellular (PC) pathway tasks reflect deficits at retinogeniculate or cortical sites, fifteen amblyopes (six anisometropic, seven strabismic, and two mixed) and 12 age-matched controls were investigated. Contrast discrimination was measured using established psychophysical procedures that differentiate MC and PC processing. Data were described with a model of the contrast response of primate retinal ganglion cells. All amblyopes and controls displayed the same contrast signatures on the MC and PC tasks, with three strabismics having reduced sensitivity. Amblyopic PC contrast gain was similar to electrophysiological estimates from visually normal, non-human primates. The sensitivity losses evident in a subset of the amblyopes reflect cortical summation deficits, with no change in retinogeniculate contrast responses. The data do not support the proposal that amblyopic contrast sensitivity losses on MC and PC tasks reflect retinogeniculate deficits; rather, the losses are due to anomalous post-retinogeniculate cortical processing of retinal signals.


Hazard perception in driving is one of the few driving-specific skills associated with crash involvement. However, this relationship has previously been examined only in studies where the majority of individuals were younger than 65. We present the first data revealing an association between hazard perception and self-reported crash involvement in drivers aged 65 and over. In a sample of 271 drivers, we found that individuals whose mean response time to traffic hazards was slower than 6.68 seconds (the ROC-curve-derived pass mark for the test) were 2.32 times (95% CI 1.46, 3.22) more likely to have been involved in a self-reported crash within the previous five years than those with faster response times. This ratio became 2.37 (95% CI 1.49, 3.28) when driving exposure was controlled for. As a comparison, individuals who failed a test of useful field of view were 2.70 (95% CI 1.44, 4.44) times more likely to have crashed than those who passed. The hazard perception test and the useful field of view measure accounted for separate variance in crash involvement. These findings indicate that hazard perception testing and training could be useful for road safety interventions for this age group.
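The "times more likely" figures above are relative risk ratios comparing drivers above and below the pass mark. How such a ratio is computed from group counts can be sketched as follows; the counts used here are hypothetical illustrations, not the study's data.

```python
def risk_ratio(crashes_slow, n_slow, crashes_fast, n_fast):
    """Relative risk: the crash proportion in the slow-response group
    divided by the crash proportion in the fast-response group."""
    return (crashes_slow / n_slow) / (crashes_fast / n_fast)

# Hypothetical counts chosen so the ratio matches the reported 2.32:
# 29 crashes among 100 slow responders vs. 25 among 200 fast responders.
rr = risk_ratio(29, 100, 25, 200)  # 0.29 / 0.125 = 2.32
```

Confidence intervals such as the reported (1.46, 3.22) would then typically be obtained from the log of this ratio and its standard error, which is beyond this sketch.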


PURPOSE: To investigate the impact of different levels of simulated visual impairment on the cognitive test performance of older adults and to compare this with previous findings in younger adults. METHODS: Cognitive performance was assessed in 30 visually normal, community-dwelling older adults (mean = 70.2 ± 3.9 years). Four standard cognitive tests were used, including the Digit Symbol Substitution Test, Trail Making Tests A and B, and the Stroop Color Word Test, under three visual conditions: normal baseline vision and two levels of cataract-simulating filters (Vistech), which were administered in a random order. Distance high-contrast visual acuity and Pelli-Robson letter contrast sensitivity were also assessed for all three visual conditions. RESULTS: Simulated cataract significantly impaired performance across all cognitive test performance measures. In addition, the impact of simulated cataract was significantly greater in this older cohort than in a younger cohort previously investigated. Individual differences in contrast sensitivity better predicted cognitive test performance than did visual acuity. CONCLUSIONS: Visual impairment can lead to slowing of cognitive performance in older adults, and these effects are greater than those observed in younger participants. This has important implications for neuropsychological testing of older populations, who have a high prevalence of cataract.


This study investigated personal and social processes of adjustment at different stages of illness for individuals with brain tumour. A purposive sample of 18 participants with mixed tumour types (9 benign and 9 malignant) and 15 family caregivers was recruited from a neurosurgical practice and a brain tumour support service. In-depth semi-structured interviews focused on participants’ perceptions of their adjustment, including personal appraisals, coping and social support since their brain tumour diagnosis. Interview transcripts were analysed thematically using open, axial and selective coding techniques. The primary theme that emerged from the analysis entailed “key sense making appraisals”, which was closely related to the following secondary themes: (1) Interactions with those in the healthcare system, (2) reactions and support from the personal support network, and (3) a diversity of coping efforts. Adjustment to brain tumour involved a series of appraisals about the illness that were influenced by interactions with those in the healthcare system, reactions and support from people in their support network, and personal coping efforts. Overall, the findings indicate that adjustment to brain tumour is highly individualistic; however, some common personal and social processes are evident in how people make sense of and adapt to the illness over time. A preliminary framework of adjustment based on the present findings and its clinical relevance are discussed. In particular, it is important for health professionals to seek to understand and support individuals’ sense-making processes following diagnosis of brain tumour.