36 results for Errors and blunders, Literary


Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis was to investigate the neuropsychological, neurophysiological, and cognitive contributors to mobility changes with increasing age. In a series of studies with adults aged 45-88 years, unsafe pedestrian behaviour and falls were investigated in relation to i) cognitive functions (including response time variability, executive function, and visual attention tests), ii) mobility assessments (including gait and balance, using motion capture cameras), iii) motor initiation and pedestrian road crossing behaviour (using a simulated pedestrian road scene), iv) neuronal and functional brain changes (using a computer-based crossing task with magnetoencephalography), and v) quality of life questionnaires (including fear of falling and restricted range of travel). Older adults are more likely to be fatally injured at the far-side of the road than at the near-side; however, the underlying mobility and cognitive processes related to lane-specific (i.e. near-side or far-side) pedestrian crossing errors in older adults are currently unknown. The first study (Chapter 2) explored cognitive, motor initiation, and mobility predictors of unsafe pedestrian crossing behaviours. Its purpose was to determine whether collisions at the near-side and far-side would be differentially predicted by mobility indices (such as walking speed and postural sway), motor initiation, and cognitive function (including spatial planning, visual attention, and within-participant variability) with increasing age. The results suggest that near-side unsafe pedestrian crossing errors are related to processing speed, whereas far-side errors are related to spatial planning difficulties. Both near-side and far-side crossing errors were related to walking speed and motor initiation measures (specifically motor initiation variability).
The salient mobility predictors of unsafe pedestrian crossings identified in the above study were examined in Chapter 3 in conjunction with the presence of a history of falls. The purpose of this study was to determine the extent to which walking speed (indicated as a salient predictor of unsafe crossings and start-up delay in Chapter 2) and previous falls can be predicted and explained by age-related changes in mobility and cognitive function (specifically within-participant variability and spatial ability). Self-rated mobility score, sit-to-stand time, motor initiation, and within-participant variability were found to predict 53.2% of walking speed variance. Although no significant model was found to predict fall history variance, postural sway and attentional set-shifting ability were found to be strongly related to the occurrence of falls within the last year. Next, in Chapter 4, unsafe pedestrian crossing behaviour and its pedestrian predictors (both mobility and cognitive measures) from Chapter 2 were explored in terms of increasing hemispheric laterality of attentional functions and inter-hemispheric oscillatory beta power changes associated with increasing age. Elevated beta (15-35 Hz) power in the motor cortex prior to movement, and reduced beta power post-movement, have been linked to age-related changes in mobility. In addition, increasing recruitment of both hemispheres has been shown to occur with age and to help older adults perform similarly to younger adults in cognitive tasks (Cabeza, Anderson, Locantore, & McIntosh, 2002). It has been hypothesised that changes in hemispheric neural beta power may explain the presence of more pedestrian errors at the far-side of the road in older adults. The purpose of this study was to determine whether age-related changes in cortical oscillatory beta power and hemispheric laterality are linked to unsafe pedestrian behaviour in older adults.
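The "53.2% of walking speed variance" figure above is a coefficient of determination from a multiple regression. A minimal sketch of how such a variance-explained figure is computed, with one invented predictor standing in for the study's four predictors and entirely invented data:

```python
# Hedged sketch of what "X% of variance predicted" means: the coefficient of
# determination R^2 of a least-squares fit. Data and predictor are invented.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]          # e.g. motor initiation variability
ys = [1.2, 1.9, 3.2, 3.8, 5.1]          # e.g. walking speed (invented units)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
alpha = my - beta * mx                   # least-squares intercept and slope

ss_res = sum((y - (alpha + beta * x)) ** 2 for x, y in zip(xs, ys))
ss_tot = sum((y - my) ** 2 for y in ys)
r2 = 1.0 - ss_res / ss_tot               # fraction of variance explained
print(round(r2, 3))                      # 0.987
```

With several predictors the fit is a multivariate least-squares problem, but the R² definition (1 minus residual over total sum of squares) is unchanged.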
Results indicated that pedestrian errors at the near-side are linked to hemispheric bilateralisation and neural overcompensation post-movement, whereas far-side unsafe errors are linked to a failure to employ neural compensation methods (hemispheric bilateralisation). Finally, in Chapter 5, fear of falling, life-space mobility, and quality of life in old age were examined to determine their relationships with cognition, mobility (including fall history and pedestrian behaviour), and motor initiation. In addition to death and injury, mobility decline (such as the pedestrian errors in Chapter 2 and the falls in Chapter 3) and cognition can negatively affect quality of life and result in activity avoidance. Further, the number of falls in Chapter 3 was not significantly linked to mobility and cognition alone, and may be further explained by a fear of falling. The objective of this study was to determine the role of mobility and cognition in fear of falling and life-space mobility, and their impact on quality of life measures. Results indicated that missing safe pedestrian crossing gaps (potentially indicating crossing anxiety) and mobility decline were consistent predictors of fear of falling, reduced life-space mobility, and quality of life variance. Social community (total number of close family and friends) was also linked to life-space mobility and quality of life. Lower cognitive function (particularly processing speed and reaction time) was found to predict variance in fear of falling and quality of life in old age. Overall, the findings indicated that mobility decline (particularly walking speed or walking difficulty), processing speed, and intra-individual variability in attention (including motor initiation variability) are salient predictors of participant safety (mainly pedestrian crossing errors) and wellbeing with increasing age. More research is required to produce a significant model to explain the number of falls.

Abstract:

Since the 1950s, pedagogical stylistics has been intrinsically linked with the teaching of written texts (and especially literary texts) to speakers of English as a second language. This is despite the fact that for decades many teachers have also structured their lessons in L1 classrooms around the linguistic features of literary texts as a means of enhancing their students’ understanding of literature and language. Recognizing that instructors in both L1 and L2 settings were often employing related pedagogical techniques without realizing that their colleagues in the other context were facing similar challenges, the PEDSIG group of the Poetics and Linguistics Association (PALA) has sought to add a theoretical dimension to research undertaken into practice in the stylistics classroom. Its goals, then, were: to establish a working definition of pedagogical stylistics; to identify the theoretical and pedagogical underpinnings of the discipline shared by L1 and L2 practitioners; and to point, if possible, towards any emerging consensus on good practice. The group determined that the principal aim of stylistics in the classroom is to make students aware of language use within chosen texts, and that what characterizes pedagogical stylistics is classroom activity that is interactive between the text and the (student) reader. Preliminary findings, from a pilot study involving a poem by Langston Hughes, suggest that the process of improving students’ linguistic sensibilities must place greater emphasis upon the text as action: i.e. upon the mental processing which is such a proactive part of reading and interpretation, and upon how all of these elements – pragmatic and cognitive as well as linguistic – function within quite specific social and cultural contexts.

Abstract:

On the basis of a transcribed French television corpus made up of two news bulletins, two chat shows and one literary programme recorded in February 2003, this paper explores the claim that the passé simple (PS) may still be used in prepared oral discourse (Pfister 1974). The corpus does not support that use on television, but it does seem to suggest a shift from temporal to aspectual features in French television talk: a perfective presentation prevails over a past-tense presentation. This trend would need to be confirmed with a larger television corpus, tested in other types of oral discourse, and checked against written corpora.

Abstract:

Sexualidad y Escritura (1850-2000) is a collection of thirteen essays focusing on the complex relationship between gender and writing in Spain from 1850 to 2000. The collection aims to provide a specifically Spanish cultural and historical context for the study of gender and writing, and to challenge the effectiveness and validity of applying and adapting some feminist theory (based mainly in French and Anglo literary traditions) to works by both male and female Spanish writers. The introduction sets the tone for the essays that follow by discussing Gilbert and Gubar's concept of female authors' 'anxiety of authorship', and the reasons why their notion of a male-dominated writing profession does not necessarily apply to Spanish literature of the nineteenth century in particular. The notable presence and success of female writers during the Romantic period, and the way in which they in effect managed to feminize the writing profession, illustrate how different the Spanish literary context is from French, English or American models. The editors state that, rather than needing to work up the courage to take up the pen and publish their works, the issue facing Spanish women writers during parts of the last 150 years has been how to maintain or regain their authorial voice and their place in letters, fighting to keep their heads above the rising and falling tides of literary trends.

Abstract:

Although much has been written about Mary Shelley's Frankenstein, the part played by Erasmus Darwin (1731-1802) has been almost entirely neglected. This is odd since, apart from some ghost stories, Dr Darwin is the one influence mentioned in both the 1816 and 1831 prefaces to the book. The present contribution aims to redress that omission. It seeks to show that Darwin's ideas about spontaneous generation, his anti-establishment views, and his literary genius played a significant role in forming the 'dark and shapeless substance' surging in Mary Shelley's mind during the summer of 1816, from which her tale of Gothic horror emerged. It is, however, ultimately ironic that Frankenstein, which warns against a too-enthusiastic use of scientific knowledge, should have been partly inspired by one of the most optimistically forward-looking of all late eighteenth-century thinkers. © 2007 Institute of Materials, Minerals and Mining.

Abstract:

Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so the addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations, but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between and attachment of suffixes, and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical-validity requirement, and to overgeneration, minimised by rule reformulation and by restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Multiple rules applicable to an input suffix need their precedence established.
The resistance of prefixations to segmentation has been addressed by identifying linking-vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with the morphological rules into a hybrid model, fed only with empirical data collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. In total, 218,802 directed derivational links have been encoded in the lexicon rather than in the wordnet component of the model, because the lexicon provides the optimal clustering of word senses. Both the links and the analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. The failure of all experiments to outperform disambiguation by frequency reflects on WordNet's sense distinctions.
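A minimal sketch of how a character-substitution suffix rule with a lexical-validity requirement might operate. The rule set, lexicon, and function names here are illustrative assumptions, not taken from the thesis:

```python
# Hypothetical sketch: recover a source word from a suffixed form using
# character-substitution rules, accepting a result only if it is lexically
# valid. Substitution (rather than bare segmentation) handles cases such as
# "happiness" -> "happy", where stripping "-ness" alone would yield "happi".

LEXICON = {"happy", "rely", "deny", "run"}

# Each rule maps a suffix pattern to the substitution that restores the stem.
RULES = [
    ("iness", "y"),   # happiness -> happy
    ("iance", "y"),   # reliance  -> rely
    ("ial",   "y"),   # denial    -> deny
    ("ning",  ""),    # running   -> run (consonant doubling undone)
]

def derive_stem(word):
    """Return (stem, rule) for the first rule whose output is in the lexicon."""
    for suffix, replacement in RULES:
        if word.endswith(suffix):
            stem = word[: -len(suffix)] + replacement
            if stem in LEXICON:          # the lexical-validity requirement
                return stem, (suffix, replacement)
    return None, None

print(derive_stem("happiness"))  # ('happy', ('iness', 'y'))
```

In a real system the rule precedence, exception stoplists, and iterative refinement described above would sit on top of this basic mechanism.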

Abstract:

We report the case of a neologistic jargonaphasic and ask whether her target-related and abstruse neologisms are the result of a single deficit, which affects some items more severely than others, or of two deficits: one affecting lexical access and the other phonological encoding. We analyse both correct/incorrect performance and errors, and apply both traditional and formal methods (maximum-likelihood estimation and model selection). All evidence points to a single deficit at the level of phonological encoding. Further characteristics are used to constrain the locus still further. V.S. does not show the type of length effect expected of a memory component, nor the pattern of errors associated with an articulatory deficit. We conclude that her neologistic errors can result from a single deficit at a level of phonological encoding that immediately follows lexical access, where segments are represented in terms of their features. We do not conclude, however, that this is the only possible locus that will produce phonological errors in aphasia, or, indeed, jargonaphasia.
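The model-selection logic can be illustrated with a toy maximum-likelihood comparison: a one-parameter (single-deficit) account versus a two-parameter (two-deficit) account of correct/incorrect counts, penalised by AIC. All counts are invented for illustration and are not V.S.'s data:

```python
import math

# Toy sketch: compare a one-parameter and a two-parameter account of
# correct/incorrect naming data by maximum likelihood plus an AIC penalty.
# The binomial coefficient is omitted as it is constant across models.

def binom_loglik(correct, total, p):
    """Log-likelihood of `correct` successes in `total` Bernoulli trials."""
    return correct * math.log(p) + (total - correct) * math.log(1 - p)

# Invented data: performance on two item sets (e.g. short vs long words).
data = [(30, 50), (28, 50)]  # (correct, total) per set

# Model 1: a single deficit -> one shared success probability.
p_hat = sum(c for c, _ in data) / sum(n for _, n in data)
ll1 = sum(binom_loglik(c, n, p_hat) for c, n in data)
aic1 = 2 * 1 - 2 * ll1

# Model 2: two deficits -> a separate probability per item set.
ll2 = sum(binom_loglik(c, n, c / n) for c, n in data)
aic2 = 2 * 2 - 2 * ll2

best = "single deficit" if aic1 <= aic2 else "two deficits"
print(round(aic1, 1), round(aic2, 1), best)  # 138.1 139.9 single deficit
```

With these invented counts the extra parameter does not buy enough likelihood to justify its penalty, so the simpler single-deficit account is preferred, mirroring the form of the argument (not the actual analysis) in the study.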

Abstract:

Mechanical, physical and chemical changes in the surface of commercial thin film metal evaporated (ME) magnetic recording media have been correlated with recording error and signal degradation measurements. Modified and adapted commercial Hi-8 video recorders were used for sample generation, while analytical techniques such as SXPS, IMS and SEM were employed in the surface characterisation. The durability of the media was assessed through stop motion (still frame) and cycling tests, where error growth and signal degradation were measured as a function of running time. The tests were performed under ambient (22°C, 40% RH) and high humidity (22°C, 80% RH) conditions. Characterisation of the lubricant layer on each tape was performed through models based on XPS and angle-resolved XPS. The lubricant thickness can significantly affect the durability and signal output level of a thin film tape, so reliable quantification is important. Various models were considered for determining the lubricant thickness, although ultimately the most suitable was deemed to be a model that assumed a uniform layer structure. In addition to the ME media, equivalent durability tests and surface analysis experiments were performed using a commercial metal particle (MP) tape so that comparisons could be made between the two types of recording media. The signal performance of the ME media was found to be quite different from that of the MP tape, since dropout errors and signal degradation increased at a much earlier stage. Extensive surface analyses enabled the mechanisms responsible for media failure and error growth in the ME and MP tapes to be identified; these were found to result from cyclic stressing and fatigue on the immediate substrate of the media.
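Uniform-overlayer XPS thickness models are typically based on Beer-Lambert-style attenuation of the substrate photoelectron signal. A hedged sketch under that standard assumption; the attenuation length and intensities below are illustrative, not values from the study:

```python
import math

# Standard uniform-overlayer attenuation model often used with (angle-resolved)
# XPS: the substrate signal through a film of thickness d is
#     I = I0 * exp(-d / (L * cos(theta)))
# where L is the electron attenuation length and theta the emission angle
# measured from the surface normal. Rearranging for d gives the thickness.

def overlayer_thickness(i_covered, i_clean, attenuation_length_nm, theta_deg):
    """Thickness (nm) of a uniform overlayer from substrate-signal attenuation."""
    return (attenuation_length_nm * math.cos(math.radians(theta_deg))
            * math.log(i_clean / i_covered))

# Example: substrate peak attenuated to 60% at normal emission, L = 3 nm.
d = overlayer_thickness(0.6, 1.0, 3.0, 0.0)
print(round(d, 2))  # 1.53 (nm)
```

Angle-resolved measurements repeat this at several theta values; a uniform layer predicts a straight line of ln(I) against 1/cos(theta), which is one way such a model is validated.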

Abstract:

This thesis is devoted to the tribology of the head-to-tape interface of linear tape recording systems, with the OnStream ADR™ system used as an experimental platform. Combining experimental characterisation with computer modelling, a comprehensive picture of the mechanisms involved in a tape recording system is drawn. The work is designed to isolate the mechanisms responsible for the physical spacing between head and tape, with the aim of minimising spacing losses and errors and optimising signal output. Standard heads (used in current ADR products) and prototype heads (DLC- and SPL-coated heads, and dummy heads built from Al2O3-TiC and alternative single-phase ceramics intended to constitute the head's tape-bearing surface) are tested in a controlled environment for up to 500 hours (exceptionally 1000 hours). Evidence of wear on the standard head is mainly observable as preferential wear of the TiC phase of the Al2O3-TiC ceramic. The TiC grains are believed to delaminate through a fatigue wear mechanism, a hypothesis further supported by modelling, which locates the maximum von Mises equivalent stress at a depth equivalent to the TiC recession (20 to 30 nm). Debris from delaminated TiC residues is moreover found trapped within the pole-tip recession, and is therefore assumed to provide three-body abrasive particles, increasing the pole-tip recession. Iron-rich stain is found over the cycled standard head surface (preferentially over the pole-tip and, to a lesser extent, over the TiC grains) under all environmental conditions except high temperature/humidity, where mainly organic stain is apparent. Temperature (local or global) affects staining rate and appearance; stain transfer is generally promoted at high temperature. Humidity affects transfer rate and quantity; low humidity produces thinner stains at a higher rate. Stain generally targets head materials with high electrical conductivity, i.e. Permalloy and TiC.
Stains are found to decrease the friction at the head-to-tape interface, delay the hollowing-out of the TiC recession, and act as a protective soft coating reducing the pole-tip recession. This is, however, at the expense of an additional spacing at the head-to-tape interface of the order of 20 nm. Two kinds of wear-resistant coating are tested: diamond-like carbon (DLC) and a superprotective layer (SPL), 10 nm and 20 to 40 nm thick, respectively. The DLC coating disappears within 100 hours, possibly due to abrasive and fatigue wear. SPL coatings are generally more resistant, particularly at high temperature and low humidity, possibly in relation to stain transfer. 20 nm coatings are found to depend on the wear behaviour of the substrate, whereas 40 nm coatings depend on the adhesive strength of the coating/substrate interface. These observations seem to locate the wear-driving forces 40 nm below the surface, and hence indicate that for coatings in the 10 nm thickness range (i.e. compatible with high-density recording) the substrate resistance must be taken into account. Single-phase ceramics, as candidates for a wear-resistant tape-bearing surface, are tested in the form of full-contour dummy heads. The absence of a second phase eliminates the preferential wear observed at the Al2O3-TiC surface; very low wear rates and no evidence of brittle fracture are observed.

Abstract:

In this paper we develop a set of novel Markov chain Monte Carlo algorithms for Bayesian smoothing of partially observed non-linear diffusion processes. The sampling algorithms developed herein use a deterministic approximation to the posterior distribution over paths as the proposal distribution for a mixture of an independence and a random walk sampler. The approximating distribution is sampled by simulating an optimized time-dependent linear diffusion process derived from the recently developed variational Gaussian process approximation method. Flexible blocking strategies are introduced to further improve the mixing, and thus the efficiency, of the sampling algorithms. The algorithms are tested on two diffusion processes: one with a double-well potential drift and another with a sine drift. The new algorithms' accuracy and efficiency are compared with state-of-the-art hybrid Monte Carlo path sampling. It is shown that in practical, finite-sample applications the algorithm is accurate except in the presence of large observation errors and low observation densities, which lead to a multi-modal structure in the posterior distribution over paths. More importantly, the variational-approximation-assisted sampling algorithm outperforms hybrid Monte Carlo in terms of computational efficiency, except when the diffusion process is densely observed with small errors, in which case both algorithms are equally efficient.
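The proposal mixture can be sketched in miniature: a Metropolis-Hastings chain that alternates an independence proposal drawn from a fixed Gaussian approximation with a local random walk, here on a 1-D double-well target. This is a toy stand-in for the paper's path-space algorithm, with every setting invented for illustration:

```python
import math
import random

# Minimal illustrative sketch (not the paper's algorithm): Metropolis-Hastings
# with a 50/50 mixture of an independence proposal from a fixed Gaussian
# approximation and a symmetric random-walk proposal.
# Target: an unnormalised 1-D double-well density with modes near -1 and +1.

random.seed(0)

def log_target(x):
    return -(x * x - 1.0) ** 2 / 0.5

MU, SIGMA = 0.0, 1.2   # crude Gaussian approximation standing in for the
                       # variational approximation used in the paper

def log_gauss(x):
    return -0.5 * ((x - MU) / SIGMA) ** 2 - math.log(SIGMA * math.sqrt(2.0 * math.pi))

def step(x):
    if random.random() < 0.5:                      # independence proposal
        y = random.gauss(MU, SIGMA)
        log_a = log_target(y) - log_target(x) + log_gauss(x) - log_gauss(y)
    else:                                          # symmetric random-walk proposal
        y = x + random.gauss(0.0, 0.3)
        log_a = log_target(y) - log_target(x)
    return y if random.random() < math.exp(min(0.0, log_a)) else x

x, samples = 0.0, []
for _ in range(20000):
    x = step(x)
    samples.append(x)

# A well-mixing chain visits both wells; the independence component is what
# lets the chain jump between modes, which a pure random walk does slowly.
print(min(samples) < -0.5, max(samples) > 0.5)  # True True
```

The design point mirrors the paper's motivation: the global (approximation-based) proposal handles mode hopping, while the local random walk refines samples within a mode.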

Abstract:

Modern cosmopolitans are compulsive explorers in search of knowledge of world cultures; their role as translators of different languages enhances cross-cultural understanding. Defined as a "world citizen", the cosmopolitan emerges as a habitual city-dweller whose existence coincides with the emergence of the modern metropolis. Whether as Kant's blueprint for "world peace" or Goethe's "world literature", this study of cosmopolitanism introduces profiles of authors and intellectuals whose contribution to German and Austrian literary culture spans the globe.

Abstract:

Methods of dynamic modelling and analysis of structures, for example the finite element method, are well developed. However, it is generally agreed that accurate modelling of complex structures is difficult, and for critical applications it is necessary to validate or update the theoretical models using data measured from actual structures. Techniques for identifying the parameters of linear dynamic models using vibration test data have recently attracted considerable interest. However, no method has received general acceptance, owing to a number of difficulties, mainly: (i) the incomplete number of vibration modes that can be excited and measured; (ii) the incomplete number of coordinates that can be measured; (iii) inaccuracy in the experimental data; and (iv) inaccuracy in the model structure. This thesis reports on a new approach to updating the parameters of a finite element model, as well as of a lumped parameter model with a diagonal mass matrix. The structure and its theoretical model are equally perturbed by adding mass or stiffness, and an incomplete set of eigen-data is measured. The parameters are then identified by iterative updating of the initial estimates, by sensitivity analysis, using the eigenvalues, or both the eigenvalues and eigenvectors, of the structure before and after perturbation. It is shown that, with a suitable choice of the perturbing coordinates, exact parameters can be identified if the data and the model structure are exact. The theoretical basis of the technique is presented. To cope with measurement errors and possible inaccuracies in the model structure, a well-known Bayesian approach is used to minimise the least-squares difference between the updated and the initial parameters. The eigen-data of the structure with added mass or stiffness are also determined from the frequency response data of the unmodified structure by a structural modification technique; thus, mass or stiffness do not have to be added physically.
The mass-stiffness addition technique is demonstrated by simulation examples and laboratory experiments on beams and an H-frame.
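The perturbation-and-sensitivity idea can be sketched on a toy 2-DOF system: generate "measured" eigenvalues from true parameters before and after a known added mass, then recover the stiffnesses by iteratively updating initial estimates with finite-difference sensitivities (a Gauss-Newton scheme). Everything here is an invented illustration, not the thesis code:

```python
import math

# Toy illustration: identify the two stiffnesses of a 2-DOF spring-mass chain
# (m1 = m2 = 1) by iteratively updating initial estimates using the
# eigenvalues of the system before and after a known mass dm is added to
# coordinate 1, via a finite-difference sensitivity (Gauss-Newton) scheme.

def eigvals(k1, k2, dm=0.0):
    """Eigenvalues of M^-1 K for K = [[k1+k2, -k2], [-k2, k2]], M = diag(1+dm, 1)."""
    a, b = (k1 + k2) / (1.0 + dm), -k2 / (1.0 + dm)
    c, d = -k2, k2
    tr, det = a + d, a * d - b * c
    s = math.sqrt(tr * tr - 4.0 * det)
    return [(tr - s) / 2.0, (tr + s) / 2.0]

def model(k):
    """Stacked eigenvalues of the unperturbed and mass-perturbed systems."""
    return eigvals(k[0], k[1], 0.0) + eigvals(k[0], k[1], 0.5)

measured = model([2.0, 1.0])   # "measurements" generated from the true k1, k2

k, h = [1.0, 0.5], 1e-6        # initial estimates, finite-difference step
for _ in range(20):
    y = model(k)
    r = [m - yi for m, yi in zip(measured, y)]        # eigenvalue residuals
    cols = []                                         # sensitivity columns dλ/dk_j
    for j in range(2):
        kp = list(k)
        kp[j] += h
        cols.append([(yp - yi) / h for yp, yi in zip(model(kp), y)])
    # solve the 2x2 normal equations (J^T J) dk = J^T r by hand
    a11 = sum(c * c for c in cols[0])
    a12 = sum(c0 * c1 for c0, c1 in zip(cols[0], cols[1]))
    a22 = sum(c * c for c in cols[1])
    b1 = sum(c * ri for c, ri in zip(cols[0], r))
    b2 = sum(c * ri for c, ri in zip(cols[1], r))
    det = a11 * a22 - a12 * a12
    k = [k[0] + (a22 * b1 - a12 * b2) / det, k[1] + (a11 * b2 - a12 * b1) / det]

print([round(ki, 3) for ki in k])  # [2.0, 1.0]
```

Four measured eigenvalues over-determine the two unknowns, which is the point of the perturbation: the added mass supplies the extra independent equations that incomplete eigen-data would otherwise lack.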

Abstract:

Background: The controversy surrounding the non-uniqueness of predictive gene lists (PGLs), i.e. small selected subsets of genes from the very large number of potential candidates available in DNA microarray experiments, is now widely acknowledged [1]. Many of these studies have focused on constructing discriminative semi-parametric models and as such are also subject to the issue of random correlations of sparse model selection in high-dimensional spaces. In this work we outline a different approach, based around an unsupervised, patient-specific, nonlinear topographic projection of predictive gene lists. Methods: We construct nonlinear topographic projection maps based on inter-patient gene-list relative dissimilarities. The Neuroscale, Stochastic Neighbor Embedding (SNE) and Locally Linear Embedding (LLE) techniques are used to construct two-dimensional projective visualisation plots of 70-dimensional PGLs per patient. Classifiers are also constructed to identify the prognosis indicator of each patient using the resulting projections from those visualisation techniques, and to investigate whether, a posteriori, the two prognosis groups are separable on the evidence of the gene lists. A literature-proposed predictive gene list for breast cancer is benchmarked against a separate gene list using the above methods. Generalisation ability is investigated by using the mapping capability of Neuroscale to visualise a follow-up study based on the projections derived from the original dataset. Results: The results indicate that small subsets of patient-specific PGLs have insufficient prognostic dissimilarity to permit a distinction between the two prognosis groups. Uncertainty and diversity across multiple gene expressions prevent unambiguous, or even confident, patient grouping. Comparative projections across different PGLs provide similar results.
Conclusion: The random correlation with an arbitrary outcome induced by small-subset selection from very high-dimensional, interrelated gene expression profiles leads to outcomes with associated uncertainty. This continuum and uncertainty preclude any attempt at constructing discriminative classifiers. However, a patient's gene expression profile could possibly be used in treatment planning, based on knowledge of other patients' responses. We conclude that many of the patients involved in such medical studies are intrinsically unclassifiable on the basis of the provided PGL evidence. This additional category of 'unclassifiable' should be accommodated within medical decision support systems if serious errors and unnecessary adjuvant therapy are to be avoided.
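The inter-patient dissimilarities that feed such topographic projections can be sketched as a simple distance matrix over per-patient gene lists. The profiles below are invented, and Euclidean distance is one assumed choice of dissimilarity; the study's own dissimilarity measure may differ:

```python
import math

# Hypothetical sketch: the input to a topographic projection such as Neuroscale
# is a matrix of inter-patient dissimilarities over predictive gene lists.
# Three invented 5-gene expression profiles stand in for 70-dimensional PGLs.

patients = {
    "p1": [0.1, 0.8, 0.3, 0.5, 0.9],
    "p2": [0.2, 0.7, 0.4, 0.5, 0.8],
    "p3": [0.9, 0.1, 0.8, 0.2, 0.1],
}

def dissimilarity(a, b):
    """Euclidean distance between two gene expression profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

names = sorted(patients)
D = {(i, j): dissimilarity(patients[i], patients[j]) for i in names for j in names}

# p1 and p2 are near-duplicates; p3 is distant from both.
print(round(D[("p1", "p2")], 3), round(D[("p1", "p3")], 3))  # 0.2 1.453
```

A projection method then seeks 2-D coordinates whose pairwise distances approximate D; the study's finding is that, for real patient PGLs, these dissimilarities are too small and too uncertain between prognosis groups for any such layout to separate them.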

Abstract:

FULL TEXT: Like many people, one of my favourite pastimes over the holiday season is to watch the great movies offered on the television channels and the new releases in the movie theatres, or to catch up on those DVDs I have been wanting to watch all year. Recently we had the new ‘Star Wars’ movie, ‘The Force Awakens’, which is reckoned to become the highest grossing movie of all time, and the latest offering from James Bond, ‘Spectre’ (which included, for the car aficionados amongst you, the gorgeous new Aston Martin DB10). It is always amusing to see how vision correction or eye injury is dealt with by movie makers. Spy movies and science fiction movies have a free hand to design aliens with multiple eyes on stalks, retina-scanning door locks, or goggles that can see through walls. Eye surgery is usually shown as some kind of simplified day-case laser treatment that gives instant results, apart from the great scene in the original ‘Terminator’ movie where Arnold Schwarzenegger's android character sustains an injury to one eye and then proceeds to remove the humanoid covering of this mechanical eye over a bathroom sink. I suppose it is much more difficult to try to include contact lenses in such movies. You may, though, recall the film ‘Charlie's Angels’, which had a scene where one of the Angels wore a contact lens with a retinal image imprinted on it so she could bypass a retinal-scan door lock, and the Eddie Murphy spy movie ‘I Spy’, in which he wore contact lenses with electronic gadgetry that allowed whatever he was looking at to be beamed back to someone else, a kind of remote video camera. Maybe we aren’t quite there in terms of the devices available, but these things are probably not the preserve of science fiction anymore, as the technology does exist to put them together. The technology to incorporate electronics into contact lenses is being developed, and I am sure we will be reporting on it in the near future.
In the meantime we can continue to enjoy unrealistic scenes of eye swapping, as in the film ‘Minority Report’ (with Tom Cruise). Much closer to home than a galaxy far, far away, in this issue you can find articles on topics in the much nearer future. More and more optometrists in the UK are becoming registered for therapeutic work as independent prescribers, and the number is likely to rise in the near future. These practitioners will be interested in the review paper by Michael Doughty, a member of the CLAE editorial panel (soon to be renamed the Jedi Council!), on prescribing drugs as part of the management of chronic meibomian gland dysfunction. Contact lenses play an active role in myopia control, and orthokeratology has been used not only to provide refractive correction but also to retard myopia. In this issue there are three articles related to this topic: first, an excellent paper looking at the association between higher spherical equivalent refractive errors and slower axial elongation; second, a paper that discusses the effectiveness and safety of overnight orthokeratology with a high-permeability lens material; and finally, a paper that looks at the stabilisation of early adult-onset myopia. Whilst we are always eager for new and exciting developments in contact lenses and related instrumentation, in this issue of CLAE there is a demonstration of a novel and practical use of a smartphone to assist anterior segment imaging, and suggestions of how this may be used in telemedicine. It is not hard to imagine someone taking an image remotely and transmitting it back to a central diagnostic centre, with the relevant expertise housed in one place, where the information can be interpreted and instruction given back to the remote site.
Back to ‘Star Wars’: you will recall that in the film ‘The Phantom Menace’, when Qui-Gon Jinn first meets Anakin Skywalker on Tatooine, he takes a sample of his blood and sends a scan of it to Obi-Wan Kenobi for analysis, and they find that the boy has the highest midichlorian count ever seen. On behalf of the CLAE Editorial Board (or Jedi Council) and the BCLA Council (the Senate of the Republic), we wish you a great 2016 and ‘may the contact lens force be with you’. Or let me put that another way: ‘the CLAE Editorial Board and BCLA Council, on behalf of, a great 2016, we wish for you!’

Abstract:

Golfers, coaches and researchers alike have keyed in on putting as an important aspect of overall golf performance. Of the three principal putting tasks (green reading, alignment, and the putting action phase), the putting action phase has attracted the most attention from coaches, players and researchers. This phase includes the alignment of the club with the ball, the swing, and ball contact. A significant amount of research in this area has focused on measuring golfers' vision strategies with eye tracking equipment. Unfortunately this research suffers from a number of shortcomings, which limit its usefulness. The purpose of this thesis was to address some of these shortcomings. The primary objective was to re-evaluate golfers' putting vision strategies using binocular eye tracking equipment and to define a new, optimal putting vision strategy associated with both higher skill and greater success. To facilitate this research, bespoke computer software was developed and validated, and new gaze behaviour criteria were defined. Additionally, the effects of training (habitual) and competition conditions on the putting vision strategy were examined, as was the effect of ocular dominance. Finally, methods for improving golfers' binocular vision strategies are discussed, and a clinical plan for the optometric management of the golfer's vision is presented. The clinical management plan includes the correction of fundamental aspects of golfers' vision, including monocular refractive errors and binocular vision defects, as well as enhancement of their putting vision strategy, with the overall aim of improving performance on the golf course. This research was undertaken to gain a better understanding of the human visual system and how it relates to the sport performance of golfers specifically. Ultimately, the analysis techniques and methods developed are applicable to the assessment of visual performance in all sports.