2 results for Retinal image quality

in National Center for Biotechnology Information - NCBI


Relevance:

100.00%

Publisher:

Abstract:

The visual stimuli that elicit neural activity differ for different retinal ganglion cells and these cells have been categorized by the visual information that they transmit. If specific visual information is conveyed exclusively or primarily by a particular set of ganglion cells, one might expect the cells to be organized spatially so that their sampling of information from the visual field is complete but not redundant. In other words, the laterally spreading dendrites of the ganglion cells should completely cover the retinal plane without gaps or significant overlap. The first evidence for this sort of arrangement, which has been called a tiling or tessellation, was for the two types of "alpha" ganglion cells in cat retina. Other reports of tiling by ganglion cells have been made subsequently. We have found evidence of a particularly rigorous tiling for the four types of ganglion cells in rabbit retina that convey information about the direction of retinal image motion (the ON-OFF direction-selective cells). Although individual cells in the four groups are morphologically indistinguishable, they are organized as four overlaid tilings, each tiling consisting of like-type cells that respond preferentially to a particular direction of retinal image motion. These observations lend support to the hypothesis that tiling is a general feature of the organization of information outflow from the retina and clearly implicate mechanisms for recognition of like-type cells and establishment of mutually acceptable territories during retinal development.
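The gap-free, non-redundant dendritic coverage described above is often summarized in the retinal-mosaic literature by a coverage factor (cell density times mean dendritic-field area), which sits near 1 for a true tiling. The sketch below is an illustrative calculation only, not taken from this study; the patch size, cell count, and dendritic-field radii are invented for the example.

```python
import numpy as np

# Illustrative sketch (all values are hypothetical, not from the study):
# coverage factor = cell density x mean dendritic-field area.
# ~1 suggests a tiling (complete coverage with little overlap),
# >>1 heavy overlap, <<1 gaps in coverage.

rng = np.random.default_rng(0)

patch_side_um = 1000.0                                 # assumed 1 mm x 1 mm retinal patch
n_cells = 100                                          # assumed number of like-type cells
field_radius_um = rng.normal(55.0, 5.0, n_cells)       # assumed dendritic-field radii (um)

density = n_cells / patch_side_um**2                   # cells per square micrometre
mean_field_area = np.mean(np.pi * field_radius_um**2)  # mean dendritic-field area (um^2)
coverage_factor = density * mean_field_area

print(f"coverage factor ~ {coverage_factor:.2f}")
```

On these invented numbers the coverage factor comes out close to 1, the signature of a single tiling; the abstract's four ON-OFF direction-selective types would each form such a mosaic, overlaid on one another.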

Relevance:

80.00%

Publisher:

Abstract:

The primate visual motion system performs numerous functions essential for survival in a dynamic visual world. Prominent among these functions is the ability to recover and represent the trajectories of objects in a form that facilitates behavioral responses to those movements. The first step toward this goal, which consists of detecting the displacement of retinal image features, has been studied for many years in both psychophysical and neurobiological experiments. Evidence indicates that achievement of this step is computationally straightforward and occurs at the earliest cortical stage. The second step involves the selective integration of retinal motion signals according to the object of origin. Realization of this step is computationally demanding, as the solution is formally underconstrained. It must rely--by definition--upon utilization of retinal cues that are indicative of the spatial relationships within and between objects in the visual scene. Psychophysical experiments have documented this dependence and suggested mechanisms by which it may be achieved. Neurophysiological experiments have provided evidence for a neural substrate that may underlie this selective motion signal integration. Together they paint a coherent portrait of the means by which retinal image motion gives rise to our perceptual experience of moving objects.
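The "computationally straightforward" first step, recovering the local displacement of retinal image features between successive frames, can be pictured with a toy correlation-based detector. This is only an illustrative stand-in (the abstract names no algorithm); the signal length and shift below are invented for the example.

```python
import numpy as np

# Illustrative sketch (hypothetical values): recover the displacement of a
# 1-D "retinal image" between two frames by cross-correlation, standing in
# for the first, local motion-detection step described in the abstract.

rng = np.random.default_rng(1)

frame0 = rng.standard_normal(256)        # luminance profile at time t (invented)
true_shift = 7                           # pixels the pattern moves between frames (invented)
frame1 = np.roll(frame0, true_shift)     # the same pattern, displaced, at time t+1

# Cross-correlate the two frames and take the lag with the largest response.
corr = np.correlate(frame1 - frame1.mean(), frame0 - frame0.mean(), mode="full")
lags = np.arange(-len(frame0) + 1, len(frame0))
estimated_shift = lags[np.argmax(corr)]

print(f"true shift = {true_shift}, estimated shift = {estimated_shift}")
```

The harder second step, deciding which of these local motion signals belong to the same object before integrating them, is exactly the underconstrained problem the abstract discusses and is not captured by this toy detector.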