A Concept about Directionality in Sound Perception

We all understand that, having two ears, we can hear panning when we listen to reproduced stereo, and perhaps also that sounds seem to come ‘from outside’ as opposed to ‘from inside’, corresponding to out-of-phase versus in-phase signals. But in reality, human sound perception is supposedly capable of more subtle judgments about where a sound originates. I will call this more subtle perception of direction ‘complete stereo-directionality’.

One idea which some people have pursued is that we do not just hear amplitudes associated with frequencies, but that we might also perceive phase-vectors associated with those frequencies. This idea seems to agree with the fact that at least part of our complete stereo-directionality appears to be based on Inter-Aural Time Differences as a basis for perceiving direction. It also agrees well with the fact that, in science and in machines, the amplitude of any frequency component can be represented by a complex number.
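To make that last point concrete, here is a minimal sketch, assuming NumPy is available: the FFT of a real signal yields one complex number per frequency bin, whose magnitude is the amplitude and whose angle is the phase, which is essentially the ‘phase-vector’ idea. The test tone and its parameters are my own illustrative choices.

```python
import numpy as np

# A 1 kHz sine wave sampled at 48 kHz, with a 90-degree phase offset.
fs = 48000
t = np.arange(1024) / fs
signal = np.sin(2 * np.pi * 1000 * t + np.pi / 2)

# Each FFT bin is a single complex number: a 'phase vector' per frequency.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

k = np.argmax(np.abs(spectrum))                 # strongest frequency component
print("frequency (Hz):", freqs[k])              # close to 1000 Hz
print("amplitude:", np.abs(spectrum[k]))        # magnitude of the complex number
print("phase (rad):", np.angle(spectrum[k]))    # its angle, i.e. the phase
```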

But this idea does not seem to agree well with the fact that our ultimate organ for perceiving sound is not the outer ear, nor the middle ear, but the inner ear, also known as the cochlea. As I understand it, the cochlea can differentiate along frequency mappings incredibly precisely, but not along phase relationships.

Now, there may be some reason to think that the middle ear and the skull carry out some sort of mixing of the sounds that enter the outer ear, before those sounds reach the cochlea. But for the moment, I am going to regard this detail as secondary.

I think that what ultimately happens is that on the cerebral cortex, just as with the visual lobes, the auditory lobes have a mapping of fingerprint-like ‘ridges’. The long-range mapping may be according to frequency, but the short-range mapping may be such that one set of ridges corresponds to input from one ear, while the negative of that same pattern of ridges represents the input from the opposite ear.

And so what the cerebral cortex can do is make very precise differentiations in its short-range neural systems, between the amplitude any one frequency component has as perceived by one cochlea, and the amplitude it has as perceived by the other cochlea.

When sound events reach our ears, they can follow many paths, and may perhaps also be mixed by our middle ear, so that real phase positions lead to subtle amplitude differences, as sensed by our cochleae and as interpreted by our cerebral cortex with its ridged mappings. Inter-Aural Time Differences may also lead to subtle differences in per-frequency amplitudes by the time the sounds reach the cochlea.

And I suspect that these subtle amplitude differences are what lead to our ‘complete stereo-directionality’.


What this would also mean is that in lossy sound compression, if the programmers decided to compute a Fourier Transform of each stereo channel first – and the Discrete Cosine Transform is one type of Fourier Transform – and then to store the differences between the absolute amplitudes that result, they may quite accidentally have processed the sound closer to how human hearing processes it.
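Here is a minimal sketch of that per-channel approach, assuming SciPy is available; the frame-based function and the choice to store the mean and the difference of the absolute amplitudes are my own illustrative assumptions, not the method of any particular codec.

```python
import numpy as np
from scipy.fft import dct

def encode_frame_per_channel(left, right):
    """Hypothetical scheme: transform each stereo channel separately,
    then keep only absolute amplitudes and their per-frequency difference."""
    amp_l = np.abs(dct(left, norm='ortho'))     # DCT of the left channel
    amp_r = np.abs(dct(right, norm='ortho'))    # DCT of the right channel
    mean_amp = (amp_l + amp_r) / 2              # shared per-frequency amplitude
    diff_amp = amp_l - amp_r                    # inter-channel amplitude difference
    return mean_amp, diff_amp
```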

If instead the programmers chose to compute the L-R component in the time domain first, and then to perform some Fourier Transform of L+R and L-R, they may have intended to capture more information than can be captured the other way. But with this method they may have captured information that human hearing is not able to interpret well.

This would be especially true in cases where L and R mainly cancel, so that the amplitude of L+R is low, while the Fourier amplitude of L-R is high.
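The sketch below, again assuming SciPy and a test tone of my own choosing, shows the extreme case: two channels that are identical but opposite in polarity, so that L+R cancels almost entirely while L-R carries all the energy.

```python
import numpy as np
from scipy.fft import dct

fs = 48000
t = np.arange(1024) / fs
left = np.sin(2 * np.pi * 440 * t)
right = -left                                   # same tone, opposite polarity

# Mid/side formed in the time domain first, then transformed.
mid = left + right                              # cancels to (nearly) zero
side = left - right                             # twice the original tone

print("peak |DCT(L+R)|:", np.abs(dct(mid, norm='ortho')).max())   # ~0
print("peak |DCT(L-R)|:", np.abs(dct(side, norm='ortho')).max())  # large
```

Here the L-R spectrum is strong even though, by the argument above, a listener could not turn it into any coherent sense of direction.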

Such content might sound fascinating, due to whatever our middle ear next does with it, but it does not lead to meaningful interpretations of where that sound is even supposed to come from. Hence, while this could be psychedelic, it would not enhance our ‘complete stereo-directionality’.

Also, our brain may apply the idea that, relative to whatever sound we are focusing on, ‘all the other sounds’ form a continuous background noise, such that the sound we are focusing on may seem to have negative amplitudes, because the real amplitudes locally become lower than the virtual noise level. And while this may allow us to derive some sort of perception of phase cancellation, it may not actually be due to our cochlea having picked up phase cancellation.

Dirk