181 results for Wood Dale
Abstract:
Although placing reflective markers on pedestrians’ major joints can make pedestrians more conspicuous to drivers at night, it has been suggested that this “biological motion” effect may be reduced when visual clutter is present. We tested whether extraneous points of light affected the ability of 12 younger and 12 older drivers to see pedestrians as they drove on a closed road at night. Pedestrians wore black clothing alone or with retroreflective markings in four different configurations. One pedestrian walked in place and was surrounded by clutter on half of the trials. Another was always surrounded by visual clutter but either walked in place or stood still. Clothing configuration, pedestrian motion, and driver age influenced conspicuity but clutter did not. The results confirm that even in the presence of visual clutter pedestrians wearing biological motion configurations are recognized more often and at greater distances than when they wear a reflective vest.
Abstract:
Objectives: As the population ages, more people will be wearing presbyopic vision corrections when driving. However, little is known about the impact of these vision corrections on driving performance. This study aimed to determine the subjective driving difficulties experienced when wearing a range of common presbyopic contact lens and spectacle corrections.
Methods: A questionnaire was developed and piloted that included a series of items regarding difficulties experienced while driving under daytime and night-time conditions (rated on five-point and seven-point Likert scales). Participants included 255 presbyopic patients recruited through local optometry practices. Participants were categorized into five age-matched groups: those wearing no vision correction for driving (n = 50), bifocal spectacles (n = 54), progressive spectacles (n = 50), monovision contact lenses (n = 53), and multifocal contact lenses (n = 48).
Results: Overall, ratings of satisfaction during daytime driving were relatively high for all correction types. However, multifocal contact lens wearers were significantly less satisfied with aspects of their vision during night-time than daytime driving, particularly regarding disturbances from glare and haloes. Progressive spectacle lens wearers noticed more distortion of peripheral vision, whereas bifocal spectacle wearers reported more difficulties with tasks requiring changes of focus, and those who wore no optical correction for driving reported problems with intermediate and near tasks. Overall, satisfaction was significantly higher for progressive spectacles than for bifocal spectacles for driving.
Conclusions: Subjective visual experiences with different presbyopic vision corrections when driving vary depending on the vision tasks and lighting level. Eye-care practitioners should be aware of the driving-related difficulties experienced with each vision correction type and of the need to select corrections that match the driving needs of their patients.
Abstract:
Aims: This study investigated the effect of simulated visual impairment on the speed and accuracy of performance on a series of commonly used cognitive tests.
Methods: Cognitive performance was assessed for 30 young, visually normal subjects (mean age 22.0 ± 3.1 years) using the Digit Symbol Substitution Test (DSST), Trail Making Test (TMT) A and B, and the Stroop Colour Word Test under three visual conditions: normal vision and two levels of visually degrading filters (Vistech™), administered in random order. Distance visual acuity and contrast sensitivity were also assessed for each filter condition.
Results: The visual filters, which degraded contrast sensitivity to a greater extent than visual acuity, significantly increased the time taken to complete the DSST and the TMT A and B (p < 0.05) but not the number of errors made, and affected only some components of the Stroop test.
Conclusions: Reduced contrast sensitivity had a marked effect on the speed, but not the accuracy, of performance on commonly used cognitive tests, even in young individuals; the implications of these findings are discussed.
Abstract:
Purpose. To investigate the functional impact of amblyopia in children, the performance of amblyopic and age-matched control children on a clinical test of eye movements was compared. The influence of visual factors on test outcome measures was explored. Methods. Eye movements were assessed with the Developmental Eye Movement (DEM) test in a group of children with amblyopia (n = 39; age, 9.1 ± 0.9 years) of different causes (infantile esotropia, n = 7; acquired strabismus, n = 10; anisometropia, n = 8; mixed, n = 8; deprivation, n = 6) and in an age-matched control group (n = 42; age, 9.3 ± 0.4 years). LogMAR visual acuity (VA), stereoacuity, and refractive error were also recorded in both groups. Results. No significant difference was found between the amblyopic and age-matched control groups for any of the outcome measures of the DEM (vertical time, horizontal time, number of errors, and the ratio of horizontal time to vertical time). The DEM measures were not significantly related to VA in either eye, level of binocular function (stereoacuity), history of strabismus, or refractive error. Conclusions. The performance of amblyopic children on the DEM, a commonly used clinical measure of eye movements, has not previously been reported. Under habitual binocular viewing conditions, amblyopia has no effect on DEM outcome scores despite significant impairment of binocular vision and decreased VA in both the better and worse eye.
Abstract:
We examined differences in response latencies obtained during a validated video-based hazard perception driving test between three healthy, community-dwelling groups: 22 mid-aged (35-55 years), 34 young-old (65-74 years), and 23 old-old (75-84 years) current drivers, matched for gender, education level, and vocabulary. We found no significant difference in performance between mid-aged and young-old groups, but the old-old group was significantly slower than the other two groups. The differences between the old-old group and the other groups combined were independently mediated by useful field of view (UFOV), contrast sensitivity, and simple reaction time measures. Given that hazard perception latency has been linked with increased crash risk, these results are consistent with the idea that increased crash risk in older adults could be a function of poorer hazard perception, though this decline does not appear to manifest until age 75+ in healthy drivers.
Abstract:
Relationships between self-reported retrospective falls and cognitive measures (executive function, reaction time, processing speed, working memory, visual attention) were examined in a population-based sample of older adults (n = 658). Two of the choice reaction time tests required inhibiting responses to targets of a specific color or location, using hand and foot responses. Potentially confounding demographic variables, medical conditions, and postural sway were controlled for in logistic regression models, excluding participants with possible cognitive impairment. A factor analysis of the cognitive measures extracted factors measuring reaction time, accuracy and inhibition, and visual search. Single fallers did not differ from non-fallers in terms of health, sway, or cognitive function, except that they performed worse on accuracy and inhibition. In contrast, recurrent fallers performed worse than non-fallers on all measures. The results suggest that occasional falls in late life may be associated with subtle age-related changes in the prefrontal cortex leading to failures of executive control, whereas recurrent falling may result from more advanced brain ageing associated with generalized cognitive decline.
Abstract:
Definition of the disease phenotype is a necessary preliminary to research into the genetic causes of a complex disease. Clinical diagnosis of migraine is currently based on diagnostic criteria developed by the International Headache Society. Previously, we examined the natural clustering of these diagnostic symptoms using latent class analysis (LCA) and found that a four-class model was preferred. However, the classes can be ordered such that all symptoms progressively intensify, suggesting that a single continuous variable representing disease severity may provide a better model. Here, we compare two models, item response theory and LCA, each constructed within a Bayesian context. The deviance information criterion (DIC) is used to assess model fit. We phenotyped our population sample using these models, estimated heritability, and conducted genome-wide linkage analysis using Merlin-qtl. LCA with four classes was again preferred. After transformation, the phenotypic trait values derived from the two models are highly correlated (correlation = 0.99), and consequently the results of the subsequent genetic analyses were similar. Heritability was estimated at 0.37, while multipoint linkage analysis produced genome-wide significant linkage to chromosome 7q31-q33 and suggestive linkage to chromosomes 1 and 2. We argue that such continuous measures are a powerful tool for identifying genes contributing to migraine susceptibility.
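The model comparison in this abstract rests on the deviance information criterion. As a minimal, hedged sketch of how that criterion is computed from a Bayesian fit (the function name and the posterior-deviance inputs are illustrative assumptions, not the authors' code), DIC combines the posterior mean deviance with an effective-parameter penalty:

```python
import numpy as np

def dic(deviance_samples, deviance_at_posterior_mean):
    """Deviance information criterion for a fitted Bayesian model.

    DIC = Dbar + pD, where Dbar is the posterior mean of the deviance
    (-2 * log-likelihood evaluated over posterior draws) and
    pD = Dbar - D(theta_bar) is the effective number of parameters,
    with D(theta_bar) the deviance at the posterior mean of the
    parameters. Lower DIC indicates a better-supported model.
    """
    dbar = float(np.mean(deviance_samples))
    p_d = dbar - deviance_at_posterior_mean
    return dbar + p_d, p_d

# Hypothetical deviance draws from two competing fits (e.g. LCA vs. IRT)
# would be compared by calling dic() on each; the smaller DIC wins.
```

Comparing the DIC of the four-class LCA against that of the IRT alternative is how a preference like the one reported above would be established.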
Abstract:
Migraine is a painful disorder whose etiology remains obscure. Diagnosis is largely based on International Headache Society criteria. However, no feature occurs in all patients who meet these criteria, and no single symptom is required for diagnosis. Consequently, this definition may not accurately reflect the phenotypic heterogeneity or genetic basis of the disorder. Such phenotypic uncertainty is typical of complex genetic disorders and has encouraged interest in multivariate statistical methods for classifying disease phenotypes. We applied three popular statistical phenotyping methods—latent class analysis, grade of membership, and “fuzzy” clustering (Fanny)—to migraine symptom data, and compared the heritability and genome-wide linkage results obtained using each approach. Our results demonstrate that the different methodologies produce different clustering structures and non-negligible differences in the subsequent analyses. We therefore urge caution in the use of any single approach and suggest that multiple phenotyping methods be used.
Abstract:
In this research the reliability and availability of a fiberboard pressing plant are assessed, and a cost-based optimization of the system is performed using the Monte Carlo simulation method. The woodchip, pulp, and engineered wood industry in Australia and around the world is lucrative; hardboard manufacture is one example. The pressing system is the core of the process, as it converts wet pulp into fiberboard. The assessment identified that the pressing system has the highest downtime in the plant and represents the bottleneck in the process. A survey in the late nineties revealed that there are over one thousand such plants around the world, with the pressing system being common among them. No previous work has assessed or estimated the reliability of such a pressing system; this assessment can therefore be used for assessing any plant of this type.
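The abstract applies Monte Carlo simulation to pressing-system reliability and availability without detailing the model. As one hedged sketch only (the alternating up/down renewal model, the exponential time distributions, and the MTBF/MTTR figures below are illustrative assumptions, not the paper's data), availability can be estimated by simulating repeated failure and repair cycles:

```python
import random

def simulate_availability(mtbf, mttr, horizon, n_runs=1000, seed=42):
    """Estimate the availability of a single repairable unit by Monte Carlo.

    Alternates exponentially distributed up-times (mean mtbf) and
    down-times (mean mttr) over a fixed horizon, and returns the
    fraction of total simulated time the unit spent operating.
    """
    rng = random.Random(seed)
    total_up = 0.0
    for _ in range(n_runs):
        t = up = 0.0
        while t < horizon:
            u = rng.expovariate(1.0 / mtbf)   # time to next failure
            up += min(u, horizon - t)         # clip at end of horizon
            t += u
            if t >= horizon:
                break
            t += rng.expovariate(1.0 / mttr)  # repair (down) time
        total_up += up
    return total_up / (n_runs * horizon)
```

For exponential up and down times the estimate converges toward the analytic steady-state availability MTBF / (MTBF + MTTR); a cost-based optimization like the one described would then wrap such a simulator in a search over maintenance or design parameters.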
Abstract:
In the emerging literature on destination branding, little has been reported about performance metrics. Most research reported to date has been concerned with the development of destination brand identities and the implementation of campaigns (see, for example, Crockett & Wood 1999, Hall 1999, May 2001, Morgan et al. 2002). One area requiring increased attention is the tracking of destination brand performance over time. This is an important gap in the tourism literature, given: i) the increasing level of investment by destination marketing organisations (DMOs) in branding since the 1990s, ii) the complex political nature of DMO brand decision-making and increasing accountability to stakeholders (see Pike, 2005), and iii) the long-term nature of repositioning a destination’s image in the marketplace (see Gartner & Hunt, 1987). Indeed, a number of researchers in various parts of the world have pointed to a lack of market research monitoring destination marketing objectives, such as in Australia (see Prosser et al. 2000, Carson, Beattie and Gove 2003), North America (Sheehan & Ritchie 1997, Masberg 1999), and Europe (Dolnicar & Schoesser 2003)...
Abstract:
Protection of “critical infrastructure” has become a major issue for governments worldwide. Yet in Australia, as in many other countries, including the United States, an estimated 90% of critical infrastructure is privately owned or operated commercially – in other words, critical infrastructure protection is not the exclusive domain of government. As a result, information sharing between government and the private sector has become a vitally important component of effective risk management. However, establishing effective arrangements of this kind between the public and private sector needs to take account of existing regimes of access and public disclosure which relate to government-held documents; in particular, that which is established by freedom of information (FOI) legislation. This article examines the extent to which the current Commonwealth FOI regime is likely to act as an impediment to the private sector operators of critical infrastructure participating in government-operated information sharing arrangements. By examining developments in other jurisdictions, principally the United States, the article considers whether amendments to the current Australian FOI regime are necessary to ensure effective participation, consistent with the underlying object and purpose of FOI.
Abstract:
Although systemic androgen deprivation prolongs life in advanced prostate cancer, remissions are temporary because patients almost uniformly progress to a state of castration-resistant prostate cancer (CRPC), as indicated by recurring PSA. This complex process of progression does not seem to be stochastic, as the timing and phenotype are highly predictable, including the observation that most androgen-regulated genes are reactivated despite castrate levels of serum androgens. Recent evidence indicates that intraprostatic levels of androgens remain moderately high following systemic androgen deprivation therapy, while the androgen receptor (AR) remains functional, and silencing AR expression following castration suppresses tumor growth and blocks the expression of genes known to be regulated by androgens. From these observations, we hypothesized that CRPC progression is not independent of androgen-driven activity and that androgens may be synthesized de novo in CRPC tumors, leading to AR activation. Using the LNCaP xenograft model, we showed that tumor androgens increase during CRPC progression in correlation with PSA up-regulation. We show here that all enzymes necessary for androgen synthesis are expressed in prostate cancer tumors and that some seem to be up-regulated during CRPC progression. Using ex vivo radiotracing assays coupled to high-performance liquid chromatography with radiometric/mass spectrometry detection, we show that tumor explants isolated during CRPC progression are capable of de novo conversion of [14C]acetic acid to dihydrotestosterone, and that uptake of [3H]progesterone allows detection of the production of six other steroids upstream of dihydrotestosterone. This evidence suggests that de novo androgen synthesis may be a driving mechanism of CRPC progression following castration.