272 results for Posterior Cornea
Abstract:
Purpose: To investigate the short-term influence of imposed monocular defocus upon human optical axial length (the distance from anterior cornea to retinal pigment epithelium) and ocular biometrics. Methods: Twenty-eight young adult subjects (14 myopes and 14 emmetropes) had eye biometrics measured before and then 30 and 60 minutes after exposure to monocular (right eye) defocus. Four different monocular defocus conditions were tested, each on a separate day: control (no defocus), myopic (+3 D defocus), hyperopic (-3 D defocus) and diffuse (0.2 density Bangerter filter) defocus. The fellow eye was optimally corrected (no defocus). Results: Imposed defocus caused small but significant changes in optical axial length (p<0.0001). A significant increase in optical axial length (mean change +8 ± 14 μm, p=0.03) occurred following hyperopic defocus, and a significant reduction in optical axial length (mean change -13 ± 14 μm, p=0.0001) was found following myopic defocus. A small increase in optical axial length was observed following diffuse defocus (mean change +6 ± 13 μm, p=0.053). Choroidal thickness also exhibited significant changes under certain defocus conditions. No significant difference was found between myopes and emmetropes in the changes in optical axial length or choroidal thickness with defocus. Conclusions: Significant changes in optical axial length occur in human subjects following 60 minutes of monocular defocus. The bi-directional optical axial length changes observed in response to defocus imply that the human visual system is capable of detecting the presence and sign of defocus and altering optical axial length to move the retina towards the image plane.
Abstract:
Thomas Young (1773-1829) carried out major pioneering work in many different subjects. In 1800 he gave the Bakerian Lecture of the Royal Society on the topic of the “mechanism of the eye”: this was published in the following year (Young, 1801). Young used his own design of optometer to measure refraction and accommodation, and discovered his own astigmatism. He considered the different possible origins of accommodation and confirmed that it was due to change in shape of the lens rather than to change in shape of the cornea or an increase in axial length. However, the paper also dealt with many other aspects of visual and ophthalmic optics, such as biometric parameters, peripheral refraction, longitudinal chromatic aberration, depth-of-focus and instrument myopia. These aspects of the paper have previously received little attention. We now give detailed consideration to these and other less-familiar features of Young’s work and conclude that his studies remain relevant to many of the topics which currently engage visual scientists.
Abstract:
Purpose: The aim of this study was to investigate the capabilities of laser scanning confocal microscopy (LSCM) for undertaking qualitative and quantitative investigations of the response of the bulbar conjunctiva to contact lens wear. Methods: LSCM was used to observe and measure morphological characteristics of the bulbar conjunctiva of 11 asymptomatic soft contact lens wearers and 11 healthy volunteer subjects (controls). Results: The appearance of the bulbar conjunctiva is consistent with the known histology of this tissue based on light and electron microscopy. The thickness of the bulbar conjunctival epithelium of lens wearers (30.9 ± 1.1 μm) was less than that of controls (32.9 ± 1.1 μm) (P < 0.0001). Superficial and basal bulbar conjunctival epithelial cell densities in contact lens wearers were 91% and 79% higher, respectively, than those in controls (P < 0.0001). No difference was observed in goblet and Langerhans cell density between lens wearers and controls. Conjunctival microcysts were observed in greater numbers, and were larger in size, in lens wearers compared with controls. Conclusions: The effects of contact lens wear on the human bulbar conjunctiva can be investigated effectively at a cellular level using LSCM. The observations in this study suggest that contact lens wear can induce changes in the bulbar conjunctiva such as epithelial thinning, increased epithelial cell density, and accelerated formation and enlargement of microcysts, but has no impact on goblet or Langerhans cell density.
Abstract:
Aim/hypothesis: Immune mechanisms have been proposed to play a role in the development of diabetic neuropathy. We employed in vivo corneal confocal microscopy (CCM) to quantify the presence and density of Langerhans cells (LCs) in relation to the extent of corneal nerve damage in Bowman's layer of the cornea in diabetic patients. Methods: 128 diabetic patients aged 58±1 yrs with differing severity of neuropathy based on the Neuropathy Deficit Score (NDS 4.7±0.28) and 26 control subjects aged 53±3 yrs were examined. Subjects underwent a full neurological evaluation, evaluation of corneal sensation with non-contact corneal aesthesiometry (NCCA) and assessment of corneal nerve morphology using CCM. Results: The proportion of individuals with LCs was significantly increased in diabetic patients (73.8%) compared to control subjects (46.1%), P=0.001. Furthermore, LC density (no./mm²) was significantly increased in diabetic patients (17.73±1.45) compared to control subjects (6.94±1.58), P=0.001, and there was a significant correlation with age (r=0.162, P=0.047) and severity of neuropathy (r=−0.202, P=0.02). There was a progressive decrease in corneal sensation with increasing severity of neuropathy assessed using NDS in the diabetic patients (r=0.414, P<0.001). Corneal nerve fibre density (P<0.001), branch density (P<0.001) and length (P<0.001) were significantly decreased, whilst tortuosity (P<0.01) was increased, in diabetic patients with increasing severity of diabetic neuropathy. Conclusion: Utilising in vivo corneal confocal microscopy, we have demonstrated increased LCs in diabetic patients, particularly in the earlier phases of corneal nerve damage, suggestive of an immune-mediated contribution to corneal nerve damage in diabetes.
Abstract:
Perhaps more than any other sub-discipline in optometry and vision science, the academic field of cornea and contact lenses is populated by an assortment of extroverted and flamboyant characters who constantly travel the world, entertaining clinicians with dazzling audiovisual presentations, informing them about the latest advances in the field and generally promoting their own scientific agendas. The antithesis of this is Leo Carney (Figure 1), a highly accomplished researcher, teacher, mentor and administrator, who has quietly and with great dignity carved out an impressive career in academic optometry. Indeed, Leo Carney is optometry's quintessential ‘quiet achiever’.
Abstract:
For the first time in human history, large volumes of spoken audio are being broadcast, made available on the internet, archived, and monitored for surveillance every day. New technologies are urgently required to unlock these vast and powerful stores of information. Spoken Term Detection (STD) systems provide access to speech collections by detecting individual occurrences of specified search terms. The aim of this work is to develop improved STD solutions based on phonetic indexing. In particular, this work aims to develop phonetic STD systems for applications that require open-vocabulary search, fast indexing and search speeds, and accurate term detection. Within this scope, novel contributions are made within two research themes: first, accommodating phone recognition errors and, second, modelling uncertainty with probabilistic scores. A state-of-the-art Dynamic Match Lattice Spotting (DMLS) system is used to address the problem of accommodating phone recognition errors with approximate phone sequence matching. Extensive experimentation on the use of DMLS is carried out and a number of novel enhancements are developed that provide for faster indexing, faster search, and improved accuracy. Firstly, a novel comparison of methods for deriving a phone error cost model is presented to improve STD accuracy, resulting in up to a 33% improvement in the Figure of Merit. A method is also presented for drastically increasing the speed of DMLS search by at least an order of magnitude with no loss in search accuracy. An investigation is then presented of the effects of increasing indexing speed for DMLS, by using simpler modelling during phone decoding, with results highlighting the trade-off between indexing speed, search speed and search accuracy. The Figure of Merit is further improved by up to 25% using a novel proposal to utilise word-level language modelling during DMLS indexing. Analysis shows that this use of language modelling can, however, be unhelpful or even disadvantageous for terms with a very low language model probability. The DMLS approach to STD involves generating an index of phone sequences using phone recognition. An alternative approach to phonetic STD is also investigated that instead indexes probabilistic acoustic scores in the form of a posterior-feature matrix. A state-of-the-art system is described and its use for STD is explored through several experiments on spontaneous conversational telephone speech. A novel technique and framework are proposed for discriminatively training such a system to directly maximise the Figure of Merit. This results in a 13% improvement in the Figure of Merit on held-out data. The framework is also found to be particularly useful for index compression in conjunction with the proposed optimisation technique, providing for a substantial index compression factor in addition to an overall gain in the Figure of Merit. These contributions significantly advance the state-of-the-art in phonetic STD, by improving the utility of such systems in a wide range of applications.
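To make the theme of accommodating phone recognition errors concrete, the sketch below shows approximate phone sequence matching as a cost-weighted edit distance driven by a phone error cost model. This is a minimal Python illustration of the general technique, not the actual DMLS implementation; the cost table, example phones and acceptance threshold are hypothetical.

```python
import numpy as np

def approx_match_cost(query, indexed, sub_cost, ins_cost=1.0, del_cost=1.0):
    """Minimum-cost alignment (weighted edit distance) between a query phone
    sequence and an indexed phone sequence. `sub_cost` maps (phone_a, phone_b)
    pairs to substitution costs, e.g. derived from a phone confusion matrix;
    identical phones cost nothing, unlisted pairs cost 1."""
    m, n = len(query), len(indexed)
    d = np.zeros((m + 1, n + 1))
    d[:, 0] = np.arange(m + 1) * del_cost
    d[0, :] = np.arange(n + 1) * ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if query[i - 1] == indexed[j - 1]:
                sub = 0.0
            else:
                sub = sub_cost.get((query[i - 1], indexed[j - 1]), 1.0)
            d[i, j] = min(d[i - 1, j] + del_cost,    # phone deleted
                          d[i, j - 1] + ins_cost,    # phone inserted
                          d[i - 1, j - 1] + sub)     # match or substitution
    return d[m, n]

# Hypothetical cost model: acoustically confusable phones are cheap to swap.
costs = {("p", "b"): 0.3, ("b", "p"): 0.3}
# A putative hit is accepted when the alignment cost falls below a threshold.
print(approx_match_cost(list("pat"), list("bat"), costs))  # -> 0.3
```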
Abstract:
Background: Falls are a major health and injury problem for people with Parkinson disease (PD). Despite the severe consequences of falls, a major unresolved issue is the identification of factors that predict the risk of falls in individual patients with PD. The primary aim of this study was to prospectively determine an optimal combination of functional and disease-specific tests to predict falls in individuals with PD. Methods: A total of 101 people with early-stage PD undertook a battery of neurologic and functional tests in their optimally medicated state. The tests included Tinetti, Berg, Timed Up and Go, Functional Reach, and the Physiological Profile Assessment of Falls Risk; the latter assessment includes physiologic tests of visual function, proprioception, strength, cutaneous sensitivity, reaction time, and postural sway. Falls were recorded prospectively over 6 months. Results: Forty-eight percent of participants reported a fall and 24% reported more than one fall. In the multivariate model, a combination of the Unified Parkinson's Disease Rating Scale (UPDRS) total score, total freezing of gait score, occurrence of symptomatic postural orthostasis, Tinetti total score, and extent of postural sway in the anterior-posterior direction produced the best sensitivity (78%) and specificity (84%) for predicting falls. From the UPDRS items, only the rapid alternating task category was an independent predictor of falls. Reduced peripheral sensation and knee extension strength in fallers contributed to increased postural instability. Conclusions: Falls are a significant problem in optimally medicated early-stage PD. A combination of both disease-specific and balance- and mobility-related measures can accurately predict falls in individuals with PD.
Abstract:
Purpose. The objective of this study was to explore the discriminative capacity of non-contact corneal esthesiometry (NCCE) when compared with the neuropathy disability score (NDS), a validated, standard method of diagnosing clinically significant diabetic neuropathy. Methods. Eighty-one participants with type 2 diabetes, no history of ocular disease, trauma, or surgery, and no history of systemic disease that may affect the cornea were enrolled. Participants were ineligible if there was a history of neuropathy due to a non-diabetic cause or a current diabetic foot ulcer or infection. The corneal sensitivity threshold was measured on the eye on the side of the dominant hand, at a distance of 10 mm from the center of the cornea, using a stimulus duration of 0.9 s. The NDS was measured, producing a score ranging from 0 to 10. To determine the optimal cutoff point of corneal sensitivity that identified the presence of neuropathy (diagnosed by NDS), the Youden index and “closest-to-(0,1)” criteria were used. Results. The receiver-operator characteristic curve for NCCE for the presence of neuropathy (NDS ≥3) had an area under the curve of 0.73 (p = 0.001) and, for the presence of moderate neuropathy (NDS ≥6), an area of 0.71 (p = 0.003). By using the Youden index, for an NDS ≥3, the sensitivity of NCCE was 70% and specificity was 75%, and a corneal sensitivity threshold of 0.66 mbar or higher indicated the presence of neuropathy. When NDS ≥6 (indicating risk of foot ulceration) was applied, the sensitivity was 52% with a specificity of 85%. Conclusions. NCCE is a sensitive test for the diagnosis of minimal and more advanced diabetic neuropathy and may serve as a useful surrogate marker for diabetic and perhaps other neuropathies.
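Both cutoff-selection rules named above are simple to compute from an ROC curve: the Youden index maximises sensitivity + specificity − 1, while closest-to-(0,1) minimises the distance to the ideal ROC corner. A minimal Python sketch follows; the numbers in the usage example are purely hypothetical and are not the study's data.

```python
import numpy as np

def optimal_cutoff(thresholds, sensitivity, specificity):
    """Pick an ROC cutoff by two common criteria.

    Youden index: maximise J = sensitivity + specificity - 1.
    Closest-to-(0,1): minimise the distance from the ROC point
    (1 - specificity, sensitivity) to the ideal corner (0, 1).
    """
    sens = np.asarray(sensitivity)
    spec = np.asarray(specificity)
    youden = thresholds[np.argmax(sens + spec - 1.0)]
    dist = np.sqrt((1.0 - spec) ** 2 + (1.0 - sens) ** 2)
    closest = thresholds[np.argmin(dist)]
    return youden, closest

# Hypothetical threshold/sensitivity/specificity triples for illustration:
thr = np.array([0.30, 0.50, 0.66, 0.80])
sens = np.array([0.90, 0.80, 0.70, 0.55])
spec = np.array([0.40, 0.60, 0.75, 0.85])
print(optimal_cutoff(thr, sens, spec))  # both criteria pick 0.66 here
```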
Abstract:
In the study of traffic safety, expected crash frequencies across sites are generally estimated via the negative binomial model, assuming time-invariant safety. Since the time-invariant safety assumption may be invalid, Hauer (1997) proposed a modified empirical Bayes (EB) method. Despite the modification, no attempts have been made to examine the generalisable form of the marginal distribution resulting from the modified EB framework. Because the hyper-parameters needed to apply the modified EB method are not readily available, there has been no assessment of how accurately the modified EB method estimates safety in the presence of time-variant safety and regression-to-the-mean (RTM) effects. This study derives the closed-form marginal distribution, and reveals that the marginal distribution in the modified EB method is equivalent to the negative multinomial (NM) distribution, which is essentially the same as the likelihood function used in the random effects Poisson model. As a result, this study shows that the gamma posterior distribution from the multivariate Poisson-gamma mixture can be estimated using the NM model or the random effects Poisson model. This study also shows that the estimation errors from the modified EB method are systematically smaller than those from the comparison group method, by simultaneously accounting for the RTM and time-variant safety effects. Hence, the modified EB method via the NM model is a generalisable method for estimating safety in the presence of time-variant safety and RTM effects.
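For context, the equivalence rests on the standard multivariate Poisson-gamma marginal. A sketch in generic notation follows; the exposure terms and parameterisation are assumptions for illustration, not necessarily the paper's.

```latex
% Let y_t | lambda ~ Poisson(lambda * e_t) independently for periods t = 1..T,
% with exposures e_t, and lambda ~ Gamma(alpha, beta). Marginalising:
\begin{align*}
p(y_1,\dots,y_T)
  &= \int_0^\infty \prod_{t=1}^{T} \frac{(\lambda e_t)^{y_t} e^{-\lambda e_t}}{y_t!}
     \cdot \frac{\beta^\alpha}{\Gamma(\alpha)}\,\lambda^{\alpha-1} e^{-\beta\lambda}\,d\lambda \\
  &= \frac{\Gamma\!\big(\alpha + \textstyle\sum_t y_t\big)}{\Gamma(\alpha)\,\prod_t y_t!}
     \left(\frac{\beta}{\beta + \sum_t e_t}\right)^{\alpha}
     \prod_{t=1}^{T} \left(\frac{e_t}{\beta + \sum_t e_t}\right)^{y_t},
\end{align*}
% i.e. the negative multinomial; for T = 1 it reduces to the negative binomial,
% and the conditional lambda | y remains gamma, which the EB step exploits.
```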
Abstract:
We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms by using indirect inference. ABC methods are useful for posterior inference in the presence of an intractable likelihood function. In the indirect inference approach to ABC the parameters of an auxiliary model fitted to the data become the summary statistics. Although applicable to any ABC technique, we embed this approach within a sequential Monte Carlo algorithm that is completely adaptive and requires very little tuning. This methodological development was motivated by an application involving data on macroparasite population evolution modelled by a trivariate stochastic process for which there is no tractable likelihood function. The auxiliary model here is based on a beta–binomial distribution. The main objective of the analysis is to determine which parameters of the stochastic model are estimable from the observed data on mature parasite worms.
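A minimal Python sketch of the indirect-inference idea follows, using plain ABC rejection for clarity; the paper instead embeds the summaries in an adaptive sequential Monte Carlo algorithm. The simulator, prior and tolerance here are hypothetical stand-ins for the intractable macroparasite model.

```python
import numpy as np
from scipy.stats import betabinom
from scipy.optimize import minimize

def fit_auxiliary(counts, n_trials):
    """Fit a beta-binomial auxiliary model by maximum likelihood; its two
    fitted parameters act as the summary statistics (indirect inference)."""
    def nll(p):
        a, b = np.exp(p)  # log-parameterisation keeps a, b positive
        return -betabinom.logpmf(counts, n_trials, a, b).sum()
    res = minimize(nll, x0=np.zeros(2), method="Nelder-Mead")
    return np.exp(res.x)

def abc_rejection(observed, simulate, prior_sample, n_trials,
                  n_sims=200, keep=0.1):
    """Plain ABC rejection using auxiliary-model parameters as summaries.
    `simulate` and `prior_sample` are hypothetical stand-ins for the
    intractable model and its prior."""
    s_obs = fit_auxiliary(observed, n_trials)
    thetas, dists = [], []
    for _ in range(n_sims):
        theta = prior_sample()
        s_sim = fit_auxiliary(simulate(theta), n_trials)
        thetas.append(theta)
        dists.append(np.linalg.norm(s_sim - s_obs))
    cut = np.quantile(dists, keep)  # retain the closest draws
    return np.array(thetas)[np.array(dists) <= cut]

# Toy demonstration with a tractable stand-in simulator:
rng = np.random.default_rng(0)
obs = betabinom.rvs(50, 2.0, 5.0, size=100, random_state=rng)
post = abc_rejection(
    obs,
    simulate=lambda th: betabinom.rvs(50, th[0], th[1], size=100, random_state=rng),
    prior_sample=lambda: rng.uniform(0.5, 8.0, size=2),
    n_trials=50,
)
print(post.mean(axis=0))  # rough posterior mean of the two parameters
```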
Abstract:
This work proposes to improve spoken term detection (STD) accuracy by optimising the Figure of Merit (FOM). In this article, the index takes the form of a phonetic posterior-feature matrix. Accuracy is improved by formulating STD as a discriminative training problem and directly optimising the FOM, through its use as an objective function to train a transformation of the index. The outcome of indexing is then a matrix of enhanced posterior-features that are directly tailored for the STD task. The technique is shown to improve the FOM by up to 13% on held-out data. Additional analysis explores the effect of the technique on phone recognition accuracy, examines the actual values of the learned transform, and demonstrates that using an extended training data set results in further improvement in the FOM.
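One common wordspotting definition of the Figure of Merit is the average detection rate as the decision threshold sweeps from 1 to 10 false alarms per term per hour of searched speech. A minimal Python sketch under that assumed definition follows; the article's exact evaluation protocol may differ.

```python
import numpy as np

def figure_of_merit(scores, labels, hours, max_fa_per_hour=10):
    """Average detection rate over operating points up to `max_fa_per_hour`
    false alarms per term per hour. `scores` are detector confidences;
    `labels` are 1 for true occurrences and 0 for false alarms."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    labels = np.asarray(labels)[order]
    n_true = max(labels.sum(), 1)
    det_rate, fa_count, points = 0.0, 0, []
    for lab in labels:
        if lab:
            det_rate += 1.0 / n_true       # one more true hit detected
        else:
            fa_count += 1
            if fa_count / hours > max_fa_per_hour:
                break
            points.append(det_rate)        # detection rate at this FA rate
    return float(np.mean(points)) if points else det_rate

# Hypothetical scored hit list for one term over 1 hour of speech:
print(figure_of_merit([0.9, 0.8, 0.7, 0.6, 0.5], [1, 0, 1, 0, 1], hours=1.0))
```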
Abstract:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome is one of the most commonly reported eye health problems. This syndrome is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for their lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence, no “gold standard” test is currently available to assess the tear film integrity. Therefore, improving techniques for the assessment of the tear film quality is of clinical significance and the main motivation for the work described in this thesis. In this study the tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea and its reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth the reflected image presents a well-structured pattern. In contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for the evaluation of all the dynamic phases of the tear film. However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing the tear film dynamics. A set of novel routines was purposely developed to quantify the changes of the reflected pattern and to extract a time series estimate of the TFSQ from the video recording. The routine extracts from each frame of the video recording a maximized area of analysis, in which a metric of the TFSQ is calculated. Initially, two metrics based on Gabor filter and Gaussian gradient-based techniques were used to quantify the consistency of the pattern’s local orientation as a measure of TFSQ. These metrics have helped to demonstrate the applicability of HSV to assess the tear film, and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between bare-eye and contact lens wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on the TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, the HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while the LSI appeared to be the most sensitive method for analyzing the tear build-up time (TBUT).
The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. Receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural blinking conditions and suppressed blinking conditions, followed closely by HSV; the DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique, identified during the former clinical study, was a lack of sensitivity for quantifying the build-up/formation phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis was transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into an image of quasi-straight lines from which a block statistic was extracted (see the sketch below). This metric has shown better sensitivity under low pattern disturbance and has improved the performance of the ROC curves. Additionally, a theoretical study, based on ray-tracing techniques and topographical models of the tear film, was undertaken to fully understand the HSV measurement and the instrument’s potential limitations. Of special interest was the assessment of the instrument’s sensitivity to subtle topographic changes. The theoretical simulations have helped to provide some understanding of tear film dynamics; for instance, the model extracted for the build-up phase has provided some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series have been reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques for modeling the tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria. Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order to ensure the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV has shown good performance in a broad range of conditions (i.e., contact lens, normal and dry eye subjects). As a result, this technique could be a useful clinical tool for assessing tear film surface quality in the future.
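The Cartesian-to-polar block-processing metric can be sketched as follows. This is a minimal Python illustration under assumed details; the interpolation, block size and per-block statistic are illustrative choices, not necessarily those used in the thesis.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def unwrap_to_polar(image, centre, n_radii=128, n_angles=360, r_max=None):
    """Resample an image from Cartesian to polar coordinates so that the
    concentric Placido rings become quasi-straight horizontal lines."""
    cy, cx = centre
    if r_max is None:
        r_max = min(cy, cx)
    radii = np.linspace(0, r_max, n_radii)
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    rr, aa = np.meshgrid(radii, angles, indexing="ij")
    coords = np.stack([cy + rr * np.sin(aa), cx + rr * np.cos(aa)])
    return map_coordinates(image, coords, order=1)

def block_irregularity(polar_img, block=(16, 30)):
    """Block-processing statistic: mean per-block standard deviation.
    A smooth tear film gives regular lines and a stable value; surface
    irregularity perturbs it. (Illustrative statistic only.)"""
    h, w = polar_img.shape
    bh, bw = block
    cropped = polar_img[:h - h % bh, :w - w % bw]
    blocks = cropped.reshape(h // bh, bh, w // bw, bw)
    return blocks.std(axis=(1, 3)).mean()

# Quick check on synthetic rings centred at (128, 128):
y, x = np.mgrid[:256, :256]
rings = np.sin(np.hypot(y - 128, x - 128) / 4.0)
print(block_irregularity(unwrap_to_polar(rings, centre=(128, 128))))
```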
Abstract:
This article explores the use of probabilistic classification, namely finite mixture modelling, for identification of complex disease phenotypes, given cross-sectional data. In particular, it focuses on posterior probabilities of subgroup membership, a standard output of finite mixture modelling, and how the quantification of uncertainty in these probabilities can lead to more detailed analyses. Using a Bayesian approach, we describe two practical uses of this uncertainty: (i) as a means of describing a person’s membership to a single or multiple latent subgroups and (ii) as a means of describing identified subgroups by patient-centred covariates not included in model estimation. These proposed uses are demonstrated on a case study in Parkinson’s disease (PD), where latent subgroups are identified using multiple symptoms from the Unified Parkinson’s Disease Rating Scale (UPDRS).
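A minimal sketch of posterior membership probabilities from a fitted mixture follows. Note the article takes a Bayesian approach; the EM-fitted Gaussian mixture below is a simpler stand-in, and the data are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic symptom-score matrix (rows: patients, columns: UPDRS-like items).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (60, 4)),
               rng.normal(2.0, 1.0, (40, 4))])

# Two-component mixture fitted by EM as a stand-in for the Bayesian model.
gmm = GaussianMixture(n_components=2, random_state=1).fit(X)
post = gmm.predict_proba(X)   # posterior probabilities of subgroup membership

# Quantified uncertainty: patients near 0.5/0.5 plausibly belong to multiple
# latent subgroups, while probabilities near 0 or 1 are confident assignments.
ambiguous = np.abs(post[:, 0] - 0.5) < 0.1
print(f"{ambiguous.sum()} of {len(X)} patients have ambiguous membership")
```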
Abstract:
Purpose: To examine the impact of different endotracheal tube (ETT) suction techniques on regional end-expiratory lung volume (EELV) and tidal volume (VT) in an animal model of surfactant-deficient lung injury. Methods: Six 2-week-old piglets were intubated (4.0 mm ETT), muscle-relaxed and ventilated, and lung injury was induced with repeated saline lavage. In each animal, open suction (OS) and two methods of closed suction (CS) were performed in random order using both 5 and 8 French gauge (FG) catheters. The pre-suction volume state of the lung was standardised on the inflation limb of the pressure-volume relationship. Regional EELV and VT, expressed as a proportion of the impedance change at vital capacity (%ZVCroi) within the anterior and posterior halves of the chest, were measured during and for 60 s after suction using electrical impedance tomography. Results: During suction, 5 FG CS resulted in preservation of EELV in the anterior (non-dependent) and posterior (dependent) lung compared to the other permutations, but this only reached significance in the anterior regions (p<0.001, repeated-measures ANOVA). VT within the anterior, but not posterior, lung was significantly greater during 5 FG CS compared to 8 FG CS; the mean difference was 15.1 [95% CI 5.1, 25.1] %ZVCroi. Neither catheter size nor suction technique influenced post-suction regional EELV or VT compared to pre-suction values (repeated-measures ANOVA). Conclusions: ETT suction causes transient loss of EELV and VT throughout the lung. Catheter size exerts a greater influence than suction method, with CS only protecting against derecruitment when a small catheter is used, especially in the non-dependent lung.
Abstract:
We consider the problem of how to efficiently and safely design dose finding studies. Both current and novel utility functions are explored using Bayesian adaptive design methodology for the estimation of a maximum tolerated dose (MTD). In particular, we explore widely adopted approaches such as the continual reassessment method and minimizing the variance of the estimate of an MTD. New utility functions are constructed in the Bayesian framework and are evaluated against current approaches. To reduce computing time, importance sampling is implemented to re-weight posterior samples, thus avoiding the need to draw samples using Markov chain Monte Carlo techniques. Further, as such studies are generally first-in-man, the safety of patients is paramount. We therefore explore methods for the incorporation of safety considerations into utility functions to ensure that only safe and well-predicted doses are administered. The amalgamation of Bayesian methodology, adaptive design and compound utility functions is termed adaptive Bayesian compound design (ABCD). The performance of this methodology is investigated via the simulation of dose finding studies. The paper concludes with a discussion of results and extensions that could be included in our approach.
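A minimal Python sketch of the importance re-weighting step follows, under generic assumptions: stored posterior draws are re-weighted by the difference between new and old (unnormalised) log-densities, then used to estimate a dose's expected utility. The toy densities and utility function are hypothetical.

```python
import numpy as np

def reweight(log_post_new, log_post_old):
    """Importance weights for reusing stored posterior draws under an
    updated posterior, avoiding a fresh MCMC run. Inputs are log-densities
    (unnormalised is fine) evaluated at the stored samples."""
    log_w = log_post_new - log_post_old
    w = np.exp(log_w - log_w.max())   # subtract the max for numerical stability
    return w / w.sum()

def expected_utility(samples, weights, utility, dose):
    """Weighted Monte Carlo estimate of a dose's expected utility; `utility`
    is a hypothetical user-supplied function of (samples, dose)."""
    return np.sum(weights * utility(samples, dose))

# Toy demonstration with one-dimensional draws and a quadratic utility.
rng = np.random.default_rng(2)
samples = rng.normal(0.0, 1.0, size=2000)        # stored posterior draws
lp_old = -0.5 * samples**2                       # N(0, 1), unnormalised log-density
lp_new = -0.5 * (samples - 0.3)**2               # updated posterior, shifted mean
w = reweight(lp_new, lp_old)
print(expected_utility(samples, w, lambda s, d: -(s - d)**2, dose=0.3))  # ~ -1
```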