Stop the brain
The brain interprets incoming sensation (I hesitate to say all sensation). For this gross understatement to contain functional truth, consider what neuroscience tells us: although the vast majority of our roughly 86 billion neurons are on board by the time we are born, we constantly alter the connections between them. Each of our sensations ends in the halls of memory and prior sensation. There it connects, almost stochastically, with what came before to complete the experience of sensing, and the result gets stored so that future incoming sensation may recall and connect to it in turn.
Most developed systems of meditative practice share a similar aim: to still these processes of neural phenomenological interpretation so that only the process of processing remains. In other words, the brain experiences itself in its natural state, without interpreting incoming sensory information. To this end, various death postures and corpse postures build the habit of pratyahara, the withdrawal of the senses: turn the senses in on themselves and send them back to their source.
It could be argued that while our interpretation of sensory information has many flaws, filtered as it is through memory and emotion, the information we receive from the outside world via our receptors has its root in some type of objective truth, or objective reality. Yet before any sensory perception reaches the brain, it must pass through a complex relay mechanism.
In order for visual information to have a neural interpretation, a visual stimulus must first pass through the cornea and lens and onto the retina. Here it works its way through the photoreceptive cells (rods and cones). Individual rods and cones are connected laterally (via horizontal cells) as they connect to their bipolar cells, allowing for a large degree of 'choice' (or noise) at this early stage. From there the signal generally passes through bipolar cells to ganglion cells (a small subset of which, the photosensitive ganglion cells, respond to light directly), whose axons bundle together to form the optic nerve for transmission to the brain.
Within the retina, this signaling doesn't depend on reaching the threshold of an action potential. Photoreceptors and bipolar cells instead use graded potentials: a continuous flow of current whose magnitude varies with the strength of the stimulus. Only at the ganglion cells does the signal get converted into all-or-nothing action potentials for the long trip up the optic nerve.
The signal passes along the optic nerve into the optic chiasm. Here fully half of the fibers cross over, carrying their information to the side of the brain contralateral to its site of reception. From here it passes up the optic tract and into the lateral geniculate body, which finally relays it to the visual cortex in the occipital lobe for processing. Whew. Even here the information gets divided between the primary and secondary visual cortex, depending on whether it requires fine-grained interpretation or can be classified in terms of movement, shape, position, etc., with the result forwarded to the motor control areas so action can be initiated. A phenomenally long trip from eye to interpretation.
Hearing makes a much shorter trip. It starts with the outside of the ear, or 'pinna', which funnels vibration to the ear drum; the ear drum vibrates the three small bones of the middle ear, the malleus, incus and stapes, which transmit the vibration to the inner ear. In the inner ear the information travels through the cochlea, vibrating the organ of Corti (which lives inside the cochlea). In the organ of Corti the vibration then causes the movement of tiny hair cells. This movement initiates an action potential which travels up the cochlear (auditory) nerve, which joins the vestibular nerve to form the vestibulocochlear nerve, which connects to the cochlear nucleus in the brain stem. From here to the superior olivary nucleus, then to the inferior colliculus, the medial geniculate body and finally the auditory cortex in the temporal lobe.
Importantly, the temporal lobe lies distinctly closer to the anatomical position of the ear than the occipital lobe (which lies at the back of the head) does to the eyes. The trip from ear to primary neural interpretation site is shorter than the trip from eye to primary neural interpretation site.
Both of these examples oversimplify for the sake of brevity.
To empirically experience the difference in pathway length between these two senses and their interpretation, grab a ruler and have a friend hold it vertically, with the one-inch mark just above your open thumb and forefinger, which should be ready to catch it.
For the first part: keep your eyes open and have your friend drop the ruler, catch it as soon as you see it fall.
For the second part: close your eyes and have your friend say “go!” just as he or she drops the ruler.
Keep in mind how many eighths of an inch passed between the visual initiation and your catch, and compare this with how many passed between the auditory initiation and your catch. The farther the ruler falls before you close your fingers, the longer your reaction time.
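Because the ruler is in free fall, the distance it drops converts directly into a reaction time via the kinematic relation d = ½gt². A minimal Python sketch of that conversion follows; the two sample distances are illustrative guesses, not measured data:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def reaction_time(drop_inches: float) -> float:
    """Convert how far the ruler fell (in inches) into a reaction
    time in seconds, using free-fall kinematics:
    d = 1/2 * g * t^2  =>  t = sqrt(2d / g)."""
    drop_m = drop_inches * 0.0254  # inches -> meters
    return math.sqrt(2 * drop_m / G)

# Illustrative catch distances (assumptions, not experimental results):
print(f"caught after 7.5 in of drop: {reaction_time(7.5):.3f} s")
print(f"caught after 5.5 in of drop: {reaction_time(5.5):.3f} s")
```

A smaller catch distance on the sound-triggered trial than on the sight-triggered one would reflect the shorter auditory pathway the text describes.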
Interesting things happen when these two senses get crossed and one tries to do the work of the other. Even the best lip reader in the world has a non-zero rate of failure. When a person sees lips moving with no sound, the movements get interpreted through experience and 'best-guessing' within the brain, and the result often includes an internally created sound that bears little resemblance to the sound those moving lips would actually make. This is unsurprising: pronunciation conventions change from country to country, and each visual representation of a phoneme can be interpreted in a variety of ways, especially by the untrained.
When both common audio and visual misinterpretations are exploited and combined, we get the McGurk effect. If a video of lips mouthing one particular sound is paired with a recording of a different sound, the human interpretation is often a third, new sound that appears in neither the video nor the audio: famously, lips mouthing 'ga' dubbed with the audio 'ba' are heard as 'da'. It just goes to show how little we can know of the outside world directly. These examples lend some credence and importance to the theory that knowledge of actual reality, beyond our brain's interpretation of phenomena, comes through quieting the senses.