267 results for wavefront vergence


Relevance: 10.00%

Abstract:

Optical coherence tomography (OCT) is a noninvasive three-dimensional interferometric imaging technique capable of achieving micrometer scale resolution. It is now a standard of care in ophthalmology, where it is used to improve the accuracy of early diagnosis, to better understand the source of pathophysiology, and to monitor disease progression and response to therapy. In particular, retinal imaging has been the most prevalent clinical application of OCT, but researchers and companies alike are developing OCT systems for cardiology, dermatology, dentistry, and many other medical and industrial applications.
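The micrometer-scale resolution quoted above is set by the source bandwidth. As a quick illustration (assuming a Gaussian source spectrum and example values typical of retinal OCT, not numbers taken from this work), the axial resolution follows δz = (2 ln 2/π)·λ₀²/Δλ:

```python
import math

def oct_axial_resolution(center_wavelength_m, bandwidth_m):
    """Axial resolution of OCT for a Gaussian source spectrum:
    dz = (2 ln 2 / pi) * lambda0^2 / dlambda, in vacuum; divide by
    the tissue refractive index for the resolution inside tissue."""
    return (2 * math.log(2) / math.pi) * center_wavelength_m**2 / bandwidth_m

# Example values (assumed, typical of retinal OCT): 840 nm center, 50 nm bandwidth
dz = oct_axial_resolution(840e-9, 50e-9)
print(f"axial resolution ≈ {dz * 1e6:.1f} µm")  # prints: axial resolution ≈ 6.2 µm
```

Broadening the bandwidth, not the aperture, is what drives axial resolution in OCT, which is why it can reach micrometer scale noninvasively.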

Adaptive optics (AO) is a technique used to reduce monochromatic aberrations in optical instruments. It is used in astronomical telescopes, laser communications, high-power lasers, retinal imaging, optical fabrication and microscopy to improve system performance. Scanning laser ophthalmoscopy (SLO) is a noninvasive confocal imaging technique that produces high contrast two-dimensional retinal images. AO is combined with SLO (AOSLO) to compensate for the wavefront distortions caused by the optics of the eye, providing the ability to visualize the living retina with cellular resolution. AOSLO has shown great promise to advance the understanding of the etiology of retinal diseases on a cellular level.

Broadly, we endeavor to enhance the vision outcomes of ophthalmic patients through improved diagnostics and personalized therapy. Toward this end, the objective of the work presented herein was the development of advanced techniques for increasing the imaging speed, reducing the form factor, and broadening the versatility of OCT and AOSLO. Although our focus is on ophthalmology, the techniques developed could be applied in other medical and industrial settings. In this dissertation, a technique to quadruple the imaging speed of OCT was developed and demonstrated by imaging the retinas of healthy human subjects. A handheld, dual-depth OCT system was developed, enabling sequential imaging of the anterior segment and retina of human eyes. Finally, handheld SLO/OCT systems were developed, culminating in the design of a handheld AOSLO system. This system has the potential to provide cellular-level imaging of the human retina, resolving even the most densely packed foveal cones.

Relevance: 10.00%

Abstract:

Large telescopes require new technologies with a high level of technological maturity. This project involved the creation of an adaptive optics test bench for the on-sky performance evaluation of related devices. The bench was successfully integrated at the Observatoire du Mont-Mégantic and was used to evaluate the performance of a pyramid wavefront sensor. The system achieved an effective reduction of the point spread function by a factor of two. Several improvements are possible to further increase the performance of the system.

Relevance: 10.00%

Abstract:

Assessing the range of vergence provides information about the patient's ability to maintain binocular vision. Disparity vergence measurements can be used to quantify control of an underlying eye misalignment. In the presence of a manifest deviation, testing is performed by first compensating the angle of deviation in order to determine prognosis. Regarding the type of deviation: (a) in an exophoria there is an increase in fast fusional convergence, whereas in an esophoric deviation there is an increase in reflex fusional divergence to attain binocular single vision; (b) convergence fusion amplitudes have been found to correlate with control of the exodeviation; (c) there is a greater base-out (BO) range in esodeviations and a greater base-in (BI) range in exodeviations.

Relevance: 10.00%

Abstract:

We investigate the application of time-reversed electromagnetic wave propagation to transmit energy in a wireless power transmission system. “Time reversal” is a signal focusing method that exploits the time reversal invariance of the lossless wave equation to direct signals onto a single point inside a complex scattering environment. In this work, we explore the properties of time reversed microwave pulses in a low-loss ray-chaotic chamber. We measure the spatial profile of the collapsing wavefront around the target antenna, and demonstrate that time reversal can be used to transfer energy to a receiver in motion. We demonstrate how nonlinear elements can be controlled to selectively focus on one target out of a group. Finally, we discuss the design of a rectenna for use in a time reversal system. We explore the implication of these results, and how they may be applied in future technologies.
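The focusing idea can be illustrated with a toy one-dimensional multipath model (a sketch under assumed parameters, not the chamber experiment itself): a pulse emitted from the target propagates through a random multipath channel; re-emitting the time-reversed recording through the same reciprocal channel compresses the energy into a sharp peak back at the target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random sparse multipath impulse response (a crude stand-in for a
# ray-chaotic scattering environment)
h = np.zeros(200)
taps = rng.choice(200, size=20, replace=False)
h[taps] = rng.standard_normal(20)

# The target emits an impulse; the transceiver records r = impulse * h = h
recorded = h.copy()

# Re-emit the time-reversed recording through the same (reciprocal) channel.
# The field at the target is conv(r[::-1], h): the autocorrelation of h,
# which peaks sharply at the focus time with value sum(h**2).
refocused = np.convolve(recorded[::-1], h)

focus_time = int(np.argmax(np.abs(refocused)))
peak = np.abs(refocused[focus_time])
sidelobe = np.abs(np.delete(refocused, focus_time)).max()
print(focus_time == len(h) - 1, peak / sidelobe)  # peak lands at the focus time
```

The autocorrelation peak is always the global maximum for a lossless channel, which is the discrete analog of the time-reversal invariance exploited in the chamber.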

Relevance: 10.00%

Abstract:

Current coastal-evolution models generally lack the ability to accurately predict bed-level change in shallow (<~2 m) water, at least partly because they neglect the effect of surface-induced turbulence on sand suspension and transport. As a first step to remedy this situation, we investigated the vertical structure of turbulence in the surf and swash zones using measurements collected under random shoaling and plunging waves on a steep (initially 1:15) field-scale sandy laboratory beach. Seaward of the swash zone, turbulence was measured with a vertical array of three Acoustic Doppler Velocimeters (ADVs), while in the swash zone two vertically spaced Acoustic Doppler Velocimeter profilers (Vectrino Profilers) were deployed. The vertical turbulence structure evolves from bottom-dominated to approximately vertically uniform as the fraction of breaking waves increases to ~50%. In the swash zone, turbulence is predominantly bottom-induced during the backwash and shows a homogeneous profile during the uprush. We further find that the instantaneous turbulent kinetic energy is phase-coupled with the short-wave orbital motion under plunging breakers, with higher levels shortly after the reversal from offshore to onshore motion (i.e. at the wave front).

Relevance: 10.00%

Abstract:

In this thesis, we introduce the innovative concept of a plenoptic sensor that can determine the phase and amplitude distortion in a coherent beam, for example a laser beam that has propagated through the turbulent atmosphere. The plenoptic sensor can be applied to situations involving strong or deep atmospheric turbulence. This can improve free-space optical communications by maintaining optical links more intelligently and efficiently. In directed-energy applications, the plenoptic sensor and its fast reconstruction algorithm can give instantaneous instructions to an adaptive optics (AO) system to make intelligent corrections when directing a beam through atmospheric turbulence. The hardware of the plenoptic sensor uses an objective lens and a microlens array (MLA) to form an array of miniature "Keplerian" telescopes that share the common objective lens. In principle, the objective lens detects the phase gradient of the distorted laser beam, and the MLA retrieves the geometry of the distorted beam in the various gradient segments. The software layer of the plenoptic sensor is developed according to the application: since the device maximizes the observation of the light field in front of the sensor, different algorithms can be built on it, such as characterizing atmospheric turbulence effects or retrieving undistorted images of distant objects. Efficient 3D simulations of atmospheric turbulence, based on geometric optics, were established to help us optimize the system design and verify the correctness of our algorithms. A number of experimental platforms were built to implement the plenoptic sensor in various application scenarios and to demonstrate its improvements over traditional wavefront sensors. As a result, the plenoptic sensor offers a new way to study atmospheric turbulence and new approaches to handling its effects.
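The full plenoptic reconstruction is more involved, but the underlying idea of reading local phase gradients from spot displacements can be sketched with a Shack-Hartmann-style centroid computation (a simplified stand-in for illustration; the cell size, pixel pitch, and lenslet focal length below are all assumed values, not the thesis hardware):

```python
import numpy as np

def cell_slopes(image, n_cells, pixel_m, focal_m):
    """Estimate local wavefront slopes from spot centroid shifts in each
    lenslet cell: slope ≈ (centroid shift from cell center) * pixel / focal."""
    cell = image.shape[0] // n_cells
    slopes = np.zeros((n_cells, n_cells, 2))
    for i in range(n_cells):
        for j in range(n_cells):
            sub = image[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
            ys, xs = np.indices(sub.shape)
            total = sub.sum()
            cy, cx = (ys*sub).sum()/total, (xs*sub).sum()/total
            ref = (cell - 1) / 2.0            # centroid of an unaberrated spot
            slopes[i, j] = ((cy - ref) * pixel_m / focal_m,
                            (cx - ref) * pixel_m / focal_m)
    return slopes

# Synthetic frame: a Gaussian spot in every 32x32 px cell, shifted by a known
# (dy, dx) = (2, -3) px to mimic a uniform local tilt (assumed geometry:
# 10 µm pixels, 5 mm lenslet focal length).
n, cell, dy, dx = 4, 32, 2.0, -3.0
ys, xs = np.indices((cell, cell))
spot = np.exp(-(((ys - (cell-1)/2 - dy)**2 + (xs - (cell-1)/2 - dx)**2)
                / (2 * 2.0**2)))
frame = np.tile(spot, (n, n))
s = cell_slopes(frame, n, pixel_m=10e-6, focal_m=5e-3)
expected = (dy*10e-6/5e-3, dx*10e-6/5e-3)     # (4e-3, -6e-3) rad
print(s[0, 0], expected)
```

The plenoptic sensor goes beyond this per-cell tilt estimate by recording the full light-field geometry, which is what allows it to handle the strong-turbulence regimes where a plain Shack-Hartmann sensor breaks down.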

Relevance: 10.00%

Abstract:

One of the most exciting discoveries in astrophysics of the last decade is the sheer diversity of planetary systems. These include "hot Jupiters", giant planets so close to their host stars that they orbit once every few days; "super-Earths", planets with sizes intermediate between those of Earth and Neptune, of which no analogs exist in our own solar system; multi-planet systems with planets ranging from smaller than Mars to larger than Jupiter; planets orbiting binary stars; free-floating planets flying through the emptiness of space without any star; and even planets orbiting pulsars. Despite these remarkable discoveries, the field is still young, and there are many areas about which precious little is known. In particular, we do not know which planets orbit the Sun-like stars nearest to our own solar system, and we know very little about the compositions of extrasolar planets. This thesis provides developments in those directions through two instrumentation projects.

The first chapter of this thesis concerns detecting planets in the Solar neighborhood using precision stellar radial velocities, also known as the Doppler technique. We present an analysis determining the most efficient way to detect planets considering factors such as spectral type, wavelengths of observation, spectrograph resolution, observing time, and instrumental sensitivity. We show that G and K dwarfs observed at 400-600 nm are the best targets for surveys complete down to a given planet mass and out to a specified orbital period. Overall we find that M dwarfs observed at 700-800 nm are the best targets for habitable-zone planets, particularly when including the effects of systematic noise floors caused by instrumental imperfections. Somewhat surprisingly, we demonstrate that a modestly sized observatory, with a dedicated observing program, is up to the task of discovering such planets.
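The detectability thresholds discussed here are set by the stellar reflex radial-velocity semi-amplitude, K = (2πG/P)^{1/3} M_p sin i / ((M_* + M_p)^{2/3} √(1−e²)). A quick worked check of this standard formula (the two cases below are illustrative, not targets from the survey analysis):

```python
import math

G = 6.674e-11                        # gravitational constant, m^3 kg^-1 s^-2
M_SUN, M_JUP = 1.989e30, 1.898e27    # solar and Jovian masses, kg
YEAR = 3.156e7                       # seconds

def rv_semi_amplitude(m_planet, m_star, period_s, inc_deg=90.0, ecc=0.0):
    """Stellar reflex RV semi-amplitude in m/s for a Keplerian orbit."""
    return ((2 * math.pi * G / period_s)**(1/3)
            * m_planet * math.sin(math.radians(inc_deg))
            / ((m_star + m_planet)**(2/3) * math.sqrt(1 - ecc**2)))

print(rv_semi_amplitude(M_JUP, M_SUN, 11.86 * YEAR))   # Jupiter analog: ~12.5 m/s
print(rv_semi_amplitude(M_JUP / 317.8, M_SUN, YEAR))   # Earth-mass, 1 yr: ~0.09 m/s
```

The two-orders-of-magnitude gap between these cases is exactly why instrumental noise floors, spectral type, and wavelength coverage dominate the survey-design trade-offs analyzed in this chapter.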

We present just such an observatory in the second chapter, called the "MINiature Exoplanet Radial Velocity Array," or MINERVA. We describe the design, which uses a novel multi-aperture approach to increase stability and performance through lower system etendue, as well as keeping costs and time to deployment down. We present calculations of the expected planet yield, and data showing the system performance from our testing and development of the system at Caltech's campus. We also present the motivation, design, and performance of a fiber coupling system for the array, critical for efficiently and reliably bringing light from the telescopes to the spectrograph. We finish by presenting the current status of MINERVA, operational at Mt. Hopkins observatory in Arizona.

The second part of this thesis concerns a very different method of planet detection: direct imaging, the discovery and characterization of planets by collecting and analyzing their light. Directly analyzing planetary light is the most promising way to study planetary atmospheres, formation histories, and compositions. Direct imaging is extremely challenging, as it requires a high-performance adaptive optics system to correct the atmospheric blurring of the parent star's point-spread function, a coronagraph to suppress stellar diffraction, and image post-processing to remove non-common-path "speckle" aberrations that can overwhelm any planetary companions.

To this end, we present the "Stellar Double Coronagraph," or SDC, a flexible coronagraphic platform for use with the 200" Hale telescope. It has two focal and two pupil planes, allowing for a number of different observing modes, including multiple vortex phase masks in series for improved contrast and inner working angle behind the obscured aperture of the telescope. We present the motivation, design, performance, and data reduction pipeline of the instrument. In the following chapter, we present some early science results, including the first image of a companion to the star delta Andromedae, which had been previously hypothesized but never seen.

A further chapter presents a wavefront control code developed for the instrument, using the technique of "speckle nulling," which can remove optical aberrations from the system using the deformable mirror of the adaptive optics system. This code allows for improved contrast and inner working angles, and was written in a modular style so as to be portable to other high contrast imaging platforms. We present its performance on optical, near-infrared, and thermal infrared instruments on the Palomar and Keck telescopes, showing how it can improve contrasts by a factor of a few in less than ten iterations.
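The measurement behind speckle nulling can be sketched for a single speckle (an idealized monochromatic single-mode model, not the instrument code): probing the focal-plane intensity with a DM-injected field at four phases recovers the speckle's complex amplitude, and the anti-phased correction cancels it.

```python
import cmath
import math

E_speckle = 0.7 * cmath.exp(1j * 1.2)   # unknown complex speckle field (toy value)
a = 0.3                                 # known DM probe amplitude

def intensity(E_extra):
    """Focal-plane intensity with an extra DM-injected field added."""
    return abs(E_speckle + E_extra)**2

# Probe at four phases: I(phi) = |E|^2 + a^2 + 2 a |E| cos(phi - arg E)
I = [intensity(a * cmath.exp(1j * k * math.pi / 2)) for k in range(4)]

# Solve for the speckle's complex amplitude from the four measurements:
re = (I[0] - I[2]) / (4 * a)            # |E| cos(arg E)
im = (I[1] - I[3]) / (4 * a)            # |E| sin(arg E)
E_est = complex(re, im)

# Apply the anti-phased correction: in this noiseless model it nulls exactly.
residual = intensity(-E_est)
print(intensity(0), residual)           # speckle intensity before vs after nulling
```

In practice convergence takes several iterations because the estimate is corrupted by photon noise and by the DM's imperfect response, which is consistent with the factor-of-a-few contrast gains in under ten iterations reported above.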

One of the large challenges in direct imaging is sensing and correcting the electric field in the focal plane to remove scattered light that can be much brighter than any planets. In the last chapter, we present a new method of focal-plane wavefront sensing, combining a coronagraph with a simple phase-shifting interferometer. We present its design and implementation on the Stellar Double Coronagraph, demonstrating its ability to create regions of high contrast by measuring and correcting for optical aberrations in the focal plane. Finally, we derive how it is possible to use the same hardware to distinguish companions from speckle errors using the principles of optical coherence. We present results observing the brown dwarf HD 49197b, demonstrating the ability to detect it despite it being buried in the speckle noise floor. We believe this is the first detection of a substellar companion using the coherence properties of light.

Relevance: 10.00%

Abstract:

BACKGROUND: The visual demands of modern classrooms are poorly understood yet are relevant in determining the levels of visual function required to perform optimally within this environment. METHODS: Thirty-three Year 5 and 6 classrooms from eight south-east Queensland schools were included. Classroom activities undertaken during a full school day (9 am to 3 pm) were observed and a range of measurements recorded, including classroom environment (physical dimensions, illumination levels), text size and contrast of learning materials, habitual working distances (distance and estimated for near) and time spent performing various classroom tasks. These measures were used to calculate demand-related minimum criteria for distance and near visual acuity, contrast and sustained use of accommodation and vergence. RESULTS: The visual acuity demands for distance and near were 0.33 ± 0.13 and 0.72 ± 0.09 logMAR, respectively (using habitual viewing distances and smallest target sizes) or 0.33 ± 0.09 logMAR assuming a 2.5 times acuity reserve for sustained near tasks. The mean contrast levels of learning materials at distance and near were greater than 70 per cent. Near tasks (47 per cent) dominated the academic tasks performed in the classroom followed by distance (29 per cent), distance to near (15 per cent) and computer-based (9 per cent). On average, children engaged in continuous near fixation for 23 ± 5 minutes at a time and during distance-near tasks performed fixation changes 10 ± 1 times per minute. The mean estimated habitual near working distance was 23 ± 1 cm (4.38 ± 0.24 D accommodative demand) and the vergence demand was 0.86 ± 0.07 Δ at distance and 21.94 ± 1.09 Δ at near assuming an average pupillary distance of 56 mm. CONCLUSIONS: Relatively high levels of visual acuity, contrast demand and sustained accommodative-convergence responses are required to meet the requirements of modern classroom environments.
These findings provide an evidence base to inform prescribing guidelines and develop paediatric vision screening protocols and referral criteria.
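The accommodation and vergence figures above follow from standard clinical formulas: accommodative demand is the reciprocal of the viewing distance in metres, and vergence demand in prism dioptres is the pupillary distance in centimetres divided by the viewing distance in metres. A quick check (the 6 m distance target below is an assumption; the published vergence values are somewhat lower than this simple spectacle-plane calculation, presumably reflecting the measured viewing distances and reference-point convention used in the study):

```python
pd_cm = 5.6                 # mean pupillary distance of 56 mm, in cm
near_m, far_m = 0.23, 6.0   # near working distance from the study; far is assumed

acc_demand = 1 / near_m     # ≈ 4.35 D, close to the 4.38 D reported
verg_near = pd_cm / near_m  # ≈ 24.3 prism dioptres at 23 cm
verg_far = pd_cm / far_m    # ≈ 0.93 prism dioptres at 6 m
print(acc_demand, verg_near, verg_far)
```

The same two formulas underpin the study's conclusion: halving the working distance doubles both the accommodative and the vergence demand, so habitual near distances around 23 cm impose a sustained load well above that of distance viewing.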

Relevance: 10.00%

Abstract:

PURPOSE: Little is known about the prevalence of refractive error, binocular vision, and other visual conditions in Australian Indigenous children. This is important given the association of these visual conditions with reduced reading performance in the wider population, which may also contribute to the suboptimal reading performance reported in this population. The aim of this study was to develop a visual profile of Queensland Indigenous children. METHODS: Vision testing was performed on 595 primary schoolchildren in Queensland, Australia. Vision parameters measured included visual acuity, refractive error, color vision, nearpoint of convergence, horizontal heterophoria, fusional vergence range, accommodative facility, AC/A ratio, visual motor integration, and rapid automatized naming. Near heterophoria, nearpoint of convergence, and near fusional vergence range were used to classify convergence insufficiency (CI). RESULTS: Although refractive error (Indigenous, 10%; non-Indigenous, 16%; p = 0.04) and strabismus (Indigenous, 0%; non-Indigenous, 3%; p = 0.03) were significantly less common in Indigenous children, CI was twice as prevalent (Indigenous, 10%; non-Indigenous, 5%; p = 0.04). Reduced visual information processing skills were more common in Indigenous children (reduced visual motor integration [Indigenous, 28%; non-Indigenous, 16%; p < 0.01] and slower rapid automatized naming [Indigenous, 67%; non-Indigenous, 59%; p = 0.04]). The prevalence of visual impairment (reduced visual acuity) and color vision deficiency was similar between groups. CONCLUSIONS: Indigenous children have less refractive error and strabismus than their non-Indigenous peers. However, CI and reduced visual information processing skills were more common in this group. Given that vision screenings primarily target visual acuity assessment and strabismus detection, this is an important finding as many Indigenous children with CI and reduced visual information processing may be missed. 
Emphasis should be placed on identifying children with CI and reduced visual information processing given the potential effect of these conditions on school performance.

Relevance: 10.00%

Abstract:

With the progress of computer technology, computers are expected to interact with humans more intelligently, presenting information according to the user's psychological and physiological characteristics. However, computer users with visual problems may have difficulty perceiving icons, menus, and other graphical information displayed on the screen, limiting the efficiency of their interaction with computers. In this dissertation, a personalized and dynamic image precompensation method was developed to improve the visual performance of computer users with ocular aberrations. The precompensation was applied to graphical targets before presenting them on the screen, aiming to counteract the visual blurring caused by the ocular aberrations of the user's eye. A complete and systematic modeling approach to describe the retinal image formation of the computer user was presented, drawing on modeling tools such as Zernike polynomials, wavefront aberration, the Point Spread Function, and the Modulation Transfer Function. The ocular aberration of the computer user was first measured with a wavefront aberrometer, providing a reference for the precompensation model. The dynamic precompensation was generated from the aberration rescaled to the current pupil size, with the pupil diameter monitored in real time. The potential visual benefit of the dynamic precompensation method was explored through software simulation using aberration data from a real human subject. An "artificial eye" experiment was conducted by simulating the human eye with a high-definition camera, providing an objective evaluation of image quality after precompensation. In addition, an empirical evaluation with 20 human participants was designed and implemented, involving image recognition tests performed under a more realistic viewing environment of computer use.
The statistical analysis of the empirical experiment confirmed the effectiveness of the dynamic precompensation method, showing a significant improvement in recognition accuracy. The merit and necessity of the dynamic precompensation were further substantiated by comparison with static precompensation, and its visual benefit was confirmed by the subjective assessments collected from the participants.
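The precompensation idea can be sketched as inverse filtering: given the eye's PSF, pre-filter the on-screen image so that the subsequent ocular blur approximately cancels. A minimal frequency-domain sketch with a regularized (Wiener-style) inverse and an assumed Gaussian PSF (the dissertation's model is built from measured wavefront aberrations and a monitored pupil, not a fixed Gaussian):

```python
import numpy as np

def gaussian_psf(n, sigma):
    """Normalized Gaussian PSF on an n x n grid (a stand-in for the ocular PSF)."""
    ys, xs = np.indices((n, n)) - n // 2
    psf = np.exp(-(ys**2 + xs**2) / (2 * sigma**2))
    return psf / psf.sum()

n = 64
rng = np.random.default_rng(1)
image = np.kron(rng.random((8, 8)), np.ones((8, 8)))   # blocky test pattern
psf = gaussian_psf(n, sigma=1.5)

H = np.fft.fft2(np.fft.ifftshift(psf))                 # optical transfer function
eps = 1e-3                                             # regularization constant

# Precompensate: divide out the OTF, regularized to avoid amplifying
# frequencies the eye cannot transmit anyway.
precomp = np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(H)
                               / (np.abs(H)**2 + eps)))

# "Perceived" images: ocular blur applied to the plain vs. precompensated screen.
blur = lambda x: np.real(np.fft.ifft2(np.fft.fft2(x) * H))
mse_plain = np.mean((blur(image) - image)**2)
mse_precomp = np.mean((blur(precomp) - image)**2)
print(mse_precomp < mse_plain)   # precompensation sharpens the perceived image
```

One practical caveat this sketch glosses over: precompensated pixel values can fall outside the displayable range, so real systems trade off contrast against sharpening when rescaling the precompensated image to the screen.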

Relevance: 10.00%

Abstract:

The subject of this thesis is the dynamic characterization of a surface generated by several actuators of a ferrofluidic mirror integrated into an adaptive optics setup. It concludes a doctoral project within the Borra group spanning four years. A brief general overview of adaptive optics is presented first, followed by a section on control theory, in which the types of controllers used, namely PID and overdrive, are discussed. The effect of viscosity on the dynamic response of the system, as well as the wavefront sensors used, are then explained. The results section is subdivided into several subsections in chronological order. First, results are presented from the characterization of a new 91-actuator mirror fabricated within the group. Results obtained with various techniques, such as PSD analysis and triggered imaging, follow. An entire section covers the speed results as a function of liquid viscosity, followed by a section on Simulink simulations performed to properly identify the limits of the system. Results on the actuator overdrive technique are then presented along with future projections. The last part of this thesis covers an innovation contributed by another member of the group: the deposition of a reflective elastomer membrane and its effects on the dynamics of the system.
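The PID control mentioned above can be sketched on a toy first-order plant (illustrative gains and plant only; the thesis controls a viscous ferrofluidic deformable mirror, whose dynamics are far richer):

```python
# Discrete PID loop driving a first-order plant tau * dy/dt = u - y to a setpoint.
dt, tau = 0.01, 1.0
kp, ki, kd = 2.0, 1.0, 0.1        # illustrative gains, not from the thesis

setpoint = 1.0
y, integral, prev_err = 0.0, 0.0, setpoint
for _ in range(2000):             # simulate 20 s
    err = setpoint - y
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv   # PID control law
    prev_err = err
    y += dt * (u - y) / tau       # explicit Euler step of the plant

print(y)   # settles near the setpoint; integral action removes steady-state error
```

Raising the plant's effective viscosity (larger tau) slows the loop for fixed gains, which is the trade-off the thesis characterizes experimentally, and the overdrive technique is precisely an attempt to beat that viscous settling time.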

Relevance: 10.00%

Abstract:

We develop an algorithm and computational implementation for the simulation of problems that combine Cahn–Hilliard type diffusion with finite strain elasticity, with applications such as the electro-chemo-mechanics of lithium-ion (Li-ion) batteries in mind. We concentrate on basic computational aspects. A staggered algorithm is proposed for the coupled multi-field model. For the diffusion problem, the fourth order differential equation is replaced by a system of second order equations to deal with the regularity required of the approximation spaces. Low order finite elements are used for the spatial discretization of the fields involved (displacement, concentration, nonlocal concentration). Three extensively worked numerical examples (both 2D and 3D) show the capabilities of our approach for the representation of (i) phase separation, (ii) the effect of concentration on deformation and stress, (iii) the effect of strain on concentration, and (iv) lithiation. We analyze convergence with respect to spatial and time discretization and find that very good results are achievable using both a staggered scheme and approximated strain interpolation.
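The second-order splitting described above, μ = f′(c) − κ∇²c together with ∂c/∂t = ∇·(M∇μ), can be illustrated with a minimal 1D finite-difference sketch (explicit time stepping and all parameters chosen for illustration; the paper uses low-order finite elements and a staggered coupling with elasticity, which this sketch omits):

```python
import numpy as np

# 1D Cahn-Hilliard, split into two second-order equations:
#   mu = f'(c) - kappa * lap(c),   dc/dt = M * lap(mu),   f(c) = (c^2 - 1)^2 / 4
N, dx, dt, M, kappa = 64, 1.0, 0.01, 1.0, 1.0
lap = lambda u: (np.roll(u, 1) + np.roll(u, -1) - 2*u) / dx**2  # periodic Laplacian

rng = np.random.default_rng(0)
c = 0.05 * rng.standard_normal(N)        # small perturbation around c = 0
mass0 = c.mean()

for _ in range(4000):                    # explicit Euler in time
    mu = c**3 - c - kappa * lap(c)       # chemical potential (second-order eq.)
    c = c + dt * M * lap(mu)             # conservative update (second-order eq.)

# Spinodal decomposition: the field separates toward c = +/-1 while the total
# concentration (mass) is conserved by the divergence-form update.
print(abs(c.mean() - mass0), c.std())
```

The point of the split is visible in the code: neither equation involves more than a Laplacian, so low-order (C0) discretizations suffice, which is exactly why the paper replaces the fourth-order equation with this pair.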