Is listening to music just about the ears, or are we swimming in a sensorium? DR RICHARD VAREY burrows down into this fascinating topic.
Recently, I’ve been in the dark about listening to music. I’ve been doing it on purpose, as part of an experiment in adjusting my sensorium for better listening. The things we do for the love of great music! You see, we see before we listen, when listening is what we say we’re doing. And this changes everything.
When Roger Daltrey sang “See me, feel me, touch me … listening to you, I get the music”, he got it partly right. Had he sung “See me, hear me, feel me, smell me, taste me”, we would have had the whole ‘sense-ational’ story. We experience the world around us with multiple senses, and even music listening is multi-sensory, with at least hearing, seeing, and feeling active. And that’s not just hearsay.
We can’t hear without also seeing, unless we intentionally close or mask our eyes. Does that make our hearing and listening beneficially different if we take away the visual stimulus? It certainly removes the most substantial source of attention and distraction. If our listening is emphasised in this way, is our musical experience better or even different?
Marshall McLuhan described the sensorium as the total character of the unique and changing sensory environments we perceive. It encompasses the sensation, perception, and interpretation of information about the world around us, using faculties of the mind such as the senses, phenomenal and psychological perception, cognition, and intelligence. Thus, as we knew all along, sound quality – indeed music experience generally – is largely subjective, although we mostly agree on the names of objects in the sound field, such as performers, instruments, and playback machines, so that part is somewhat objective.
In Western cultures we see, hear, smell, taste, and feel – the five-sense sensorium described by Aristotle so long ago. It varies among people, and with age and health, in sensitivity, acuity, range, and immediacy of response (through direct or indirect communication with the brain). This matters because we sense, interpret, value, and respond to our situations through taste, touch, smell, hearing, sight, and imagination. Our senses have thus long been defined as organ-based, providing a bodily way of gathering information about ourselves and our surroundings.
More recently, science suggests we have at least 10 senses, and perhaps as many as 33! The senses are not independent; they affect each other. At the cognitive level, the auditory and visual parts of the brain are connected. When information from one sense is ambiguous, another sense can help clarify or ratify perception and interpretation (sensemaking). Sound affects vision, as we shall ‘see’, and even when sound is irrelevant to a task, it influences the way we see. Might it be that seeing affects hearing? Yes, indeed. Is hearing more important than seeing? Hearing is fundamental to our most significant activities, such as speech and music. Yet hearing also adds richness and context to even the mundane: think of water splashing, footsteps, thunder, and so on. The awareness these create helps us better see what we hear.
In 2015, the Tate Britain in London opened the immersive Tate Sensorium exhibition to allow visitors to look at art, but also to smell, feel, and hear it. These same senses are active when handling and playing records, and to a lesser extent with CDs (no, I don’t sniff mine, thank you very much!). With digital file playback however, we only hear the sound, unless we stare at the machines while listening.
There is a natural emphasis on auditory stimuli when you are consciously the listener, and yet most people now hold vision to be more valuable than hearing. Our modern mind increasingly emphasises the visual over the aural. We can ‘see’ (there it is again, right there, the modern visual bias) this in common sayings such as ‘see what I mean’ and ‘I get the picture’, in the ascent of PowerPoint, and in the categorisation of learning styles (visual mode – pictures, kinesthetic/tactile mode – gestures, auditory mode – listening).
It hasn’t always been this way. Our recent ancestors experienced a transformation of the sensorium from primary aurality to primary visuality. The human sensorium has undergone two great transformations, driven by two momentous human inventions: spoken language, then writing. The first made us listen to what we hear. The second turned our minds to linear, sequential thinking and expression. Our sensorium today is quite different from that of our ancient ancestors.
Is there an order or sequence to the senses – a hierarchy of importance? Shaped by biological predispositions and cultural influences, there indeed is such an order of preference, albeit subconscious most of the time. Cultures have different sensory orders and differing sensory capabilities, and thus biases. Many of us in Western cultures think that sight is naturally the primary sense, but is hearing (auditory perception, thus ‘audition’) more significant for awareness of our situation, and thus safety, for communication and other interaction, and for enjoyment and emotion?
McLuhan explained to us in the 1960s that society was entering an electronic era, following the print era that had superseded the literate era, which came after the tribal era. He also made clear that these are eras of primary focus, not total substitutions, so there is always residual attention to what came before. Even now, tribal, literate, and print remain, although the electronic is already dominating. For example, online messaging and email are written forms of speech, conveyed in electronic circuits. The record and CD are equivalents of the book. All are printed media that carry recordings.
He also pointed to the effect on the person (psyche) and society (culture) of shifting from an oral (holistic, all-at-once) culture to a literate (reductionist: linear, sequential) culture and beyond. Type and printing, followed by digitised/pixelated text (a form of type), assured the eye a position of total predominance in our sensorium. We’ve become so visually oriented that our listening has to be worked at, and stereo is an example of linear/directional attention – the sweet spot has a single location in relation to two sound sources.
In the emergence of the modern world, an oral (aural) culture gave way to a written (visual) one. Is it any surprise, then, that the sound quality of the music experience is no longer primary for most, and that picture, video, and graphics have become essential (think of the advent of MTV), as music now exists in the visual world and the ear is subservient to the eye? Over centuries, and accelerating through the industrial revolution and the electronic era, there has been a large-scale shift from an ear/hearing (oral) culture to an eye/seeing (visual) culture. Seeing is emphasised in our daily experience – think of the credence the law gives to an ‘eyewitness’ compared with mere ‘hearsay’. What about a person who emphasises listening, or hears before seeing, or ‘hears’ with greater acuity than they ‘see’? Some of us have retained a residual ‘acute auditory’ preference, which is why only a small minority are somewhat more attuned and sensitive to music, while some others seem to have no interest at all!
The visual field is ordinarily less than 180 degrees, due to our forward-facing eyes, and visual stimulus can be reduced to zero simply by closing your eyes, whereas the auditory field is (approximately) a total sphere at each ear and always on, unless you stick something in your ears! The natural sound field is 360 degrees in all directions, so we can judge the location of a sound source as sound waves propagate through the air as pressure changes.
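That directional judgement rests largely on tiny differences in when a sound arrives at each ear. As an illustrative sketch only (the simple path-difference model, the assumed ear spacing of 0.21 m, and the function name are my assumptions, not claims from the article):

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
EAR_SPACING = 0.21       # m, an assumed typical distance between the ears

def interaural_time_difference(azimuth_deg):
    """Approximate arrival-time difference (seconds) between the ears
    for a distant source at the given azimuth (0 = straight ahead,
    90 = directly to one side), using a simple path-difference model."""
    return EAR_SPACING * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# A source straight ahead arrives at both ears together (difference 0);
# a source directly to the side gives the largest delay, around 0.6 ms.
```

Even a delay that small is enough for the brain to place a sound in the 360-degree field the paragraph above describes.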
Listening is a complex affective, cognitive, and behavioural process in support of brain function. In acoustic space, hearing is always ‘on’ and auditory sensation is directionless: there is no fixed position of information sources, whereas pictorial space consists of objects with locality, and is closed out when our eyes are closed. When there is no object, there is no representation, and the stimuli impinge directly on emotion. Hearing is quite different from seeing, which accesses the brain’s logical, rational faculty. Thinking back to my recent article on my aversion to easy listening music, I think this explains how easy listening avoids strong emotion and sensory over-stimulation. I suggest that committed or immersive listening is appreciative listening: there is a response to the music, a valuation that may be positive or negative – and when it is negative, generally, we stop listening!
So now I come to my own situation. Visual stimulation at my listening seat is very substantial during daytime (see photo). The very engaging and quite stimulating and enjoyable view of my garden is more than half the area of the wall opposite my seat, and it’s usually bright and colourful. I think it affects my listening! Can my experience of music listening be enhanced by focusing sensory perception on hearing? To what extent can this be accomplished by reducing or removing sight? In the light, we look around to see, while in the dark, we listen. A hi-fi friend asked why listening to music is better after dark – my answer was a quieter mind, and less distraction.
You can experiment to find out how true this is by listening to music in several differing sessions with various levels of light/sight stimulus reduction and removal; a form of selective sensory deprivation. This is something you definitely can and should try at home! What I’m suggesting is to intentionally shift from seeing (with listening), to listening only (without seeing). This applies to both loudspeakers and headphones.
- With eyes open – in my listening room, I’m continually distracted by a large, bright, colourful visual field – see the photo of daytime view from my listening seat.
You probably do the next two fairly often:
- Eyes closed, room lit – you can still sense some light as the eyelids are not perfect blockers (probably for good reason).
- Light off/blacked out by curtains – try this with eyes open, then with eyes closed.
What if you take it further?
- Blackout blindfold – I don’t normally have a use for a blindfold (honest). This one is a leftover from a long-haul flight!
Another possibility is to alter your normal sensorium so that hearing becomes the primary source of stimulus, with your brain in a somewhat unusual but potentially advantageous state. A Ganzfeld mask can induce this state: a form of perceptual deprivation that puts the brain into a better listening mode by quieting the usual bombardment of sensory stimuli, especially the light and sight that dominate our everyday experience. Removing visual stimulus (and distraction) with a blackout mask helps focus attention on aural perception – a state of calm relaxation. The Ganzfeld route to a ‘quietened’ mind would seem to take this further.
The Ganzfeld principle is that the mask presents a plain field of vision with no features other than the colour of the light. The visual input to the brain is constant and uniform stimulation. The ‘complete field’ Ganzfeld effect is perceptual deprivation, caused by exposure to an unstructured, uniform stimulation field. In this condition, the brain amplifies neural noise in ‘looking’ (note the visual conception in play again in our daily language – ‘yes, I see’) for the missing visual signals. The neural noise is interpreted in the higher visual cortex of the brain as real sensory information, and gives rise to an altered state of consciousness, and hallucinations if extended beyond a threshold. The visual effect is described as loss of vision as the brain cuts off the unchanging signal from the eyes. The result is “seeing black”.
Whereas wearing a sleeping mask or sitting in a dark room tends to send the brain to sleep, to induce hallucination the brain needs to stay awake with minimal sensory input. This can be done safely, but the main uses of a Ganzfeld mask are to bring on instant meditation and hypnosis. It’s a self-healing tool for entering the Alpha-Theta brain state. The Beta wave (14-28 Hz) is the state in which you are active and alert. Alpha waves (8-12 Hz) are present when your brain is in an idling default state, typically when you’re daydreaming or consciously practising mindfulness or meditation; Alpha is the bridge between Beta and Theta, governing daydreams and fantasy, and denoting a detached, relaxed state of consciousness. Theta waves (3-8 Hz) occur during sleep but have also been observed in the deepest states of Zen meditation; Theta is a state of very deep relaxation of the mind, in which it is believed you can create and change reality instantly. As the Alpha state is conducive to stress release, relaxation, deep meditation for well-being, creativity, and super-learning, it could positively affect music listening. I used it to stop information overload and to quiet my mind for deep music listening.
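The band boundaries quoted above can be summarised in a small sketch. This is illustrative only: the ranges are the ones given in this article (definitions vary across the literature), and the function and names are mine, not a standard API.

```python
# Brain-wave bands as quoted in the text above; note the unassigned
# gap between Alpha (up to 12 Hz) and Beta (from 14 Hz).
BANDS = [
    ("Theta", 3.0, 8.0),    # deep relaxation, deepest Zen meditation
    ("Alpha", 8.0, 12.0),   # idling, daydreaming, mindfulness
    ("Beta", 14.0, 28.0),   # active and alert
]

def classify(freq_hz):
    """Return the named band containing freq_hz, or None if the
    frequency falls outside the quoted ranges."""
    for name, low, high in BANDS:
        if low <= freq_hz < high:
            return name
    return None
```

For example, a 10 Hz rhythm falls in the Alpha band – the relaxed, meditative state the mask is meant to encourage.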
While there seems to be no specific colour effect in the Ganzfeld effect, the choice might be guided by light therapy thinking in Chromotherapy. A red field connects with our grounding and survival instincts, and is associated with alertness, excitement, motion, and aggression, whereas a yellow field accesses power and ego, and is associated with stimulation, confidence, and happiness. A blue field connects with communication, and is associated with calming, serenity and sadness. (I got my Ganzfeld Light mask from Life-Changer Limited.)
As Marshall McLuhan also pointed out 50 years ago, “The hi-fi quest is for ‘realistic sound’ – the sensation of having the performing instruments ‘in the room with you’… To be in the presence of performing musicians is to experience their touch and handling of instruments as tactile and kinetic, not just as resonant… Hi-fi is not any quest for abstract effects of sound in separation from the other senses.” (Marshall McLuhan (1964) Understanding Media: The Extensions Of Man, p. 300).
Western music listening of the Greater Sweet-spotted Audiophile emphasises stillness and attention, and thus a fixed perspective. The listening experience is continuous, ever-present, and unavoidable, and less controllable than viewing. You can’t (easily, at least) close your ears! We are switched-on listeners from birth.
Some vendors are cashing in on the shift to visual emphasis in our culture, deliberately creating a connoisseur niche through ‘artisan’ terminology and mega-prices, where the fabulous handcrafted cabinet or milled block of rare platinum costs far more than the signal amplification or air movers.
I’ve written elsewhere about the supposed death of hi-fi, but this is really the decline and demotion of aural emphasis in a visual culture, just as jobs have shifted from largely full-time and continuing to part-time and fixed-term. Music engagement for many listeners has become little more than entertainment. When I began following music in the late 1960s, it was something you heard first, then saw. Today, music is seen, then heard. Listening now has to be worked at for art.
Note: No songwriters were harmed in the mis-quoting of their lyrics.
STOP PRESS: I have just seen a new product called ISOLATE Mini ear silencers that “switch off your ears”. Details at www.flareaudio.com
* Dr Richard Varey writes about the electro-mechanics and the social psychology of this technology-facilitated art we call high-fidelity music reproduction.