Bifocals in the brain: Visual information from near and far space is processed with different degrees of sharpness
Neuroscientists in Tübingen have discovered how our brain processes visual stimuli above and below the horizon differently. Researchers led by Dr. Ziad Hafed at the Werner Reichardt Centre for Integrative Neuroscience (CIN) at the University of Tübingen studied nonhuman primates and determined that different parts of the visual field are represented asymmetrically in the superior colliculus, a brain structure central to visual perception and behavior: more neural tissue is assigned to the upper visual field than to the lower one. As a result, visual stimuli above the horizon are processed more sharply, more strongly, and faster. Our brains wear bifocals, so to speak.
Seeing – arguably our most important way of perceiving the world – occurs largely without conscious intent. We see much better at the center of our visual field (along the visual axis) than in the periphery. So when our brain detects an object of interest in the periphery of our visual field, it immediately initiates an eye movement so that our visual axis intersects with that object. Once an object is in our line of sight, we can see it in greater depth and detail.
This is partly due to the much higher density of photoreceptor cells in a very small area at the center of the retina – the fovea. But the preference of visual perception for the center of the visual field is also represented in the brain: it is reflected in the structures that process the stimuli transmitted from the fovea. For example, in the superior colliculus (SC) – an area of the midbrain that initiates eye movements toward peripheral stimuli based directly on input from the eyes – much more neural tissue is dedicated to processing foveal signals than to processing peripheral signals. This phenomenon is called foveal magnification.
Now Dr. Hafed's team has shown that, besides the fovea, other parts of the visual field are also 'magnified' in the SC. Their findings reveal that the currently accepted model of the SC, which accounts only for foveal magnification, is insufficient. That simple model in effect assumes that our SC looks at the world through a magnifying glass: the closer an object is to the center of our visual field, the more clearly it is picked up by individual neurons, and the more such neurons are dedicated to processing it.
Dr. Hafed's new model modifies this picture, adding upper visual field magnification on top of foveal magnification. His team found that the upper half of the visual field is represented by SC neurons whose receptive fields are much smaller, more finely tuned to the spatial structure of incoming images, and more sensitive to image contrast. The lower visual field, by contrast, is represented at a lower resolution. The Hafed team therefore thinks of the 'lens' in the SC more as a pair of bifocals.
For Dr. Hafed, this asymmetry in neural representation is an adaptation to our everyday environment: distant objects project smaller images onto our retina than nearby objects do. Processing images of nearby objects in a way that lets us react to them quickly and usefully therefore requires a lower resolution than processing objects far away. "In our three-dimensional environments, objects in the lower half of our visual field are usually part of nearby space. An example would be the instruments in a car while driving, which are low and close to us," explains Hafed. "Meanwhile, objects far away in space, such as the next intersection, lie in the upper half of our visual field. To focus precisely on objects that are far away, we intuitively need a higher resolution in the upper visual field. Our experiments provide substantial evidence that the old model, with its symmetrical representation of the upper and lower visual field in the SC, will have to be reconsidered."
The findings of Hafed's team could greatly benefit the design of user interfaces in augmented reality (AR) and virtual reality (VR) systems. The idea that the SC's 'bifocals' translate directly into faster and more accurate eye movements toward the upper visual field could help in the strategic placement of essential information that requires quick orientation. AR and VR systems have large, wraparound displays that cover almost the entire visual field. In such scenarios, computer systems have great freedom in placing content that is critical for users, and optimizing that placement in accordance with 'human factors' currently represents a major engineering challenge.
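To make the design implication concrete, here is a minimal sketch of how an AR/VR interface might route content by visual-field half. The function name, the urgency scale, and the 0.5 threshold are all illustrative assumptions, not anything proposed in the study; the sketch only encodes the reported asymmetry (faster, more accurate orienting toward the upper field) as a placement heuristic.

```python
# Hypothetical placement heuristic for an AR/VR display, motivated by the
# reported upper-visual-field advantage. Names and thresholds are assumptions.

def choose_placement(urgency: float) -> str:
    """Return a display region for a piece of content.

    urgency is in [0, 1]; higher values mean the user must orient quickly.
    Urgent items go to the upper visual field, where eye movements were
    reported to be faster and more accurate; ambient, near-space items
    (dashboard-style readouts) go to the lower field.
    """
    if not 0.0 <= urgency <= 1.0:
        raise ValueError("urgency must be in [0, 1]")
    return "upper_field" if urgency >= 0.5 else "lower_field"

print(choose_placement(0.9))  # collision warning -> "upper_field"
print(choose_placement(0.2))  # battery readout  -> "lower_field"
```

A real system would of course weigh many more human factors (head pose, gaze history, occlusion), but the asymmetry described above suggests the vertical position of critical alerts is one such factor worth optimizing.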
This article was originally published on Medical Xpress.