Dictionaries clearly do not define reality and most certainly do not reveal how words are used on the street. We do not live within the neat and orderly context of a dictionary. Life is infinitely more complicated and messy. We all use and misuse words at random, as the irrational inclination of the moment dictates. We so often seem to agree about the unimportant things (by ignoring the detail) that we assume, and confidently expect, that all our words ought to make sense.
Knowing all the while, from bitter experience, that this is patently absurd. If language really were such a reliable tool, universally used to say, always and unerringly, exactly what we mean, there would have been no need for all those shamans, theologians, philosophers, lawyers and politicians, to whom the truth is an illusion, anathema to their raison d’être.
Contrary to popular mythology, language most certainly does not ‘work,’ in the sense that it cannot fulfill our naïve expectations of it. That said, I am well aware of the brazen impertinence of relying on language to say so. Nevertheless, it is my sincere intention to demonstrate — by reminding you of what you already know — that these words, like any other words, do not, because they cannot, convey meaning. Whatever sense you make of this text is going to be all your own work, I’m afraid. I cannot be held responsible for your perception, nor you for mine. Besides which, by tomorrow or next year I will have quite forgotten what I might have meant by what I write today.
Language is a dumb tool. Like a hammer, an eggbeater or an army. Tools lie still and do nothing until you or I ‘make use’ of them. We use language to ‘make sense.’ The human brain has evolved to quickly learn to make sense of sensory data, including spoken language, tirelessly ‘joining the dots,’ associating sights with sounds and tastes and smells, touch and emotional responses. This is only possible in close conjunction with a vast, rapidly expanding, readily accessible library, or database we call ‘memory.’
But language clearly does not allow us to say whatever we like. For example, I believe each person is unique. There never was a person like you or me, nor will be again. Each human brain meticulously lays down its own referential database (memory) according to the sensory data obtained from highly specific, never-to-be-repeated private experiences. But I literally cannot sensibly say, like the crowd in Monty Python’s ‘Life of Brian,’ “we are all individuals,” without evoking a glaring oxymoron.
We are all unique, but we are not born that way. A newborn infant has no concept of being a discrete individual. When we first learned to speak, we mimicked our elders, referring in the third person, by slavish recitation, to what would eventually metamorphose into the elaborate linguistic construct of the ‘self,’ as in, “Billy come too?” Once the child has mastered the first person pronoun “I,” there is no looking back. Thereafter we are obliged to adopt what Michel Foucault liked to call the “enunciative modality” of the “I,” the discursive position of speech. From there we are irretrievably caught within the rigid protocol of a stage performance, in which each narrative dialogue concerns only three characters.
The first and most important character in any narrative is the first person “I.” I am the only one who can speak. The evocation of “I” builds into any text an unmistakable sense of ‘self,’ ‘individual personality,’ ‘mind,’ ‘consciousness,’ ‘awareness,’ ‘moral authenticity and liability,’ and, inevitably, ‘the soul.’ These are all inescapably derived from the mere act of saying “I am.” The world and the known universe, the whole box and dice, revolve literally and therefore most significantly around the first person (Adam). That is to say, around anyone who says “I am.” We each must adopt this unique persona in order to say anything at all.
That is how language restricts what you can and what you cannot say. The enunciation of the first person “I” immediately and inescapably evokes the second person “you” as the person spoken to. “I” may even speak to my “self,” as in, “you idiot!” No matter what the social situation, the speaker has no option but to speak as “I” addressing “you.” The only other character on stage is the third person (s/he, they or it), the person spoken about, who remains silent throughout. Each of these three roles is written and performed in the singular. When public figures address crowds, “you” is always understood as a singular entity.
From which we get the rather unnerving fiction that a corporation can actually be possessed of a singular ‘mind’ and therefore of one, single-minded, moral liability. Which is never seen on the street. A crowd never behaves as a coherent, morally competent individual, crowds being nowhere near so like-minded, but rather as an instinctively driven herd. To which every avid sports fan pays undying fealty, only to vehemently disown any socially unacceptable mob behavior in the news.
The grammatically plural, first person pronoun ‘we’ is always singularly treated as a cohesive, morally coherent collective, that can only ever be understood as nominative of an allegedly unified ‘corporate body.’ Which certainly lives quite comfortably within the logical narrative of every text, but is also never seen on the street. To say, “we are all individuals” is just one of the many paradoxes that lurk within the strict limitations of language, governed as it is by the inflexible rules of grammar and syntax. To be sure, without that rigidly reliable structure we could not say anything intelligible at all. For a semantic code to work as efficiently as ours does, both sender and receiver must be able to rely on one agreed protocol.
Which means I can never simply say what I want to say. There are far too many unavoidable rhetorical land mines inherent in a necessarily limited vocabulary. While there can be no question that “you” and “I” appear to have a lot in common, each is nevertheless undeniably unique. No two people have ever shared the same cradle, precisely the same upbringing, education or life experience. Each cerebral database accumulates those memories pertaining only to the experiences of the one person wholly dependent upon that brain, “you” in your small corner and “I” in mine.
It is frequently claimed that, as one can buy a horse or a house, negotiate a financial transaction, make a logical proposal of marriage, or govern a theoretically constituted polity, declare war and make peace, the unquestioned efficacy of all such common instances of human intercourse proves beyond all reasonable doubt that it is clearly not only eminently feasible but indeed absolutely imperative to convey all manner of conceptual ideas by means of spoken and written language. And what a classic fallacy it is.
Surely, universal subscription to this simplistic premise, for all its popular currency, does a grave injustice to the nascent creativity, natural versatility, and innate individuality of each human brain. Everybody knows that no two people, whatever their renowned eloquence and linguistic competence, have ever thought alike. Let alone come to the same conclusions via identical reasoning.
All knowledge is, after all, language-based. Quite impossible, obviously, to know anything at all unless one is apprised of the appropriate, currently stereotyped words. That being so, and given that every language is a strictly ordered, grammatically determined, rigorously structured semantic system of conventional clichés, parochial metaphors and popular aphorisms, this inescapable characteristic common to all such commercially honored, pedestrian codes has the endearing and thoroughly beguiling appearance of politically convenient, rationally coherent and indisputable clarity of ‘communication.’ Which we know is nevertheless a historically persistent, perniciously pervasive and therefore admittedly quite indispensable illusion. And, as we also know, no illusion has ever failed to breed its own pertinacious delusions.
This unshakable mythical ruse relies absolutely on a universally accepted and strictly observed, deeply embedded discipline that continuously combines countless, unwritten socio-culturally significant rules of syntax with ever-shifting lexical fashions of language practice. All such comfortably familiar, suitably communal and pragmatically consensual protocols invariably rely in turn on innumerable, imaginative and legendary, if quite baseless, grand assumptions and immutably traditional superstitions. The resulting acutely embarrassing, notoriously ambiguous structure is what we nevertheless persist, proudly and publicly though never privately, in calling ‘civilization.’ And what a despicably ramshackle house of cards that monstrosity has turned out to be.
All of it is made possible by ignoring, or more generally dismissing out of hand, any oblique or explicit allusion to the inherently unique point of view of each participant. No two persons being alike, we have each arrived where we find ourselves today, from conception and birth, entirely and necessarily by our own, intensely private and highly improbable circuitous route. Each colorful narrative is a work in progress, derived from one’s own idiosyncratic and inimitable memory, upon which we must each implicitly rely if we are to make any sense of all we see and hear. And the sense we make depends on three essential ingredients: the pretext, the context and the subtext.
Each private biography lends relevance to the essential pretext each reader/listener brings to any human interaction: formal and informal presentations, gifts, music, dance, gesture, sign and facial expression. Without our readily accessible memory and fertile imagination, we could literally not make sense of any sensory data. But whatever sense we do make is always only ever going to be entirely one’s own work.
Which in turn depends on the immensely complex context surrounding each and every unique, never-to-be-repeated situation, every time. The context consists, amongst a bewildering myriad of details, of all the circumstantial evidence that we implicitly depend upon, without which we quite simply could not make sense of the raw data of any spoken and written text, no matter how solemn or flippant.
The context includes written style, vocabulary, tone of voice, inflection, accent, facial expression, gesture, background and ambient sound, risk factors, socially in/appropriate behavior, traffic, weather and simultaneous activity such as eating and drinking, buying and selling, walking and dancing and so forth.
The context also involves things like time of day, mood, blood sugar, state of health, personal agenda, financial circumstances, romantic and/or sexual ambitions and, remarkably, your very own centre of gravity. Tiny fluid-filled organs inside your inner ear, the vestibular system, allow the brain to sense gravity and so determine the direction of the centre of the Earth. Without that faculty, we would not know who we are, or where we are.
Our tireless brain is not concerned with “what’s out there.” The brain does not even know it is a brain. Does not care, in fact, if you live or die. Any more than your heart, lungs, kidneys and liver have any vested interest in your welfare. That’s not what your bodily organs are for. Your bodily organs respond to stimuli according to their genetically determined functions. That’s all. All our emotional baggage and moral imperatives are rhetorical devices. All that so-called ‘psycho-babble’ of ‘self awareness’ has everything to do with language-based ‘consciousness,’ which develops slowly over time.
While your brain does not care where you are, or who you are, it does need a reliable fix on which way is up. Otherwise we could not sit, walk, talk and chew gum. Without this simple, gravity-dependent inner ear mechanism (or when it’s irreparably damaged), we literally cannot think straight and certainly not make any rational sense of all our sensory input.
But whatever sense we do make is entirely our own and nothing like the sense anybody else makes of what looks like the same data arising in what is carelessly assumed to be the same situation. No two distinct things or events, being by definition wholly separate and discrete, can ever be exactly the same.
And then there’s the subtext. This is where we ultimately and finally obtain our enduring, presumably reliable sense of the meaning of all we see and hear. That is, the intuitive, apparently effortless translation and interpretation by each individual brain of every sight and sound detected ‘between the lines,’ all the really important stuff that is finally understood but was neither literally nor explicitly “there.”
The only possible meaning, unique to each participant in what we call ‘human communication,’ is entirely and essentially deduced, with reference to all the previous experiential data instantly and imperceptibly recalled from memory and individually illuminated by our imagination. Therefore, notwithstanding what it feels like, meaning does not reside in the text, but is the exclusive individual sense that each of us is obliged to make, according to each unique whole-of-life experience.
And to make matters really interesting, we can never be sure who anybody is. We can never know the true identity of the ostensible writer of any text, book, article, lecture, speech or sermon. “I myself” don’t even know who “I” am supposed to be, whenever “I” say “I am.” Nor should we care. “All the world’s a stage, and all the men and women merely players.” We are always playing a scripted part. All we have to work with is what we see and hear. What we read and write. What “you” say to “me.” And what “I” say to “you.” We can never really know any more than what we read and what we are told. Which is never going to be the truth. Certainly not the whole truth. In fact, everything we read and hear is anything but the truth.
What we really mean by what we say and write, and what we really understand from what we read and are told, must remain forever moot. Our intensely private experience of everything we perceive (we know not how, and care less), or at least what we can remember of it and subsequently keep redefining, is, by definition, inaccessible to conventional language. We have little choice but to believe implicitly that words always tell the whole story. When in fact we know the real substance and significance of true understanding, on which the sum of our very existential realities utterly depends, is quite beyond the scope of all the politically correct, morally justifiable clichés upon which we are nevertheless thoroughly, socio-culturally habituated, and therefore irrevocably obliged, to rely. The paradox inherent in all language is universally conventional.
So we keep talking, more than listening, reading and writing. Carelessly but selectively. Blotting out anything we can’t quite accommodate and forgetting to mention what doesn’t quite fit our coherent narrative. Pretending all along that “what I say” at least makes perfect sense to me. And so it should to “you.” Even though we can never quite escape the persistent, unsettling uncertainty as to what the “other” really meant by what s/he said and/or wrote. Every time. Hypocritical? Sure. Welcome to the family.
It seems to me that our experience of so-called ‘contemplation’ and ‘thought’ is nothing more or less than the product of the brain’s genetically determined predisposition for generating language, in endlessly random loops of partly remembered dialogues and bits of passages from books and discussions. This constant buzz of linguistic recitation, which may be what we remember as ‘dreaming’ and idle musing, and in which the “I” takes centre stage, may well be what we associate with our much-lauded, yet elusive ‘consciousness,’ our precious ‘sense of self’ and that most beguiling of human qualities, ‘my personality.’
So the brain keeps talking, as though “I” am talking to “me,” there being nothing quite as reassuring as the sound of our own voice. As all desperately lonely and isolated people know intimately well. There is often, though not always, real therapeutic value in the habitual practice of keeping a diary. Writing down what is randomly going on “upstairs,” where the action is. Because, as we like to keep comforting our “selves,” consciousness is where “I” live. My sense of “self,” we keep repeating, never sleeps. Even as we know, beyond the shadow of a doubt, that it ain’t so. In a paradox typical of the inflexible semantic structure of language, so-called “self-denial” can, by definition, have nothing whatsoever to do with “me.” Obviously, if “I” am not there to say so, how am “I” to deny my “self”? Cogito? No. Dico, ergo sum.
Daniel Dennett offers a deflationary theory of the “self.” To him, selves are not physically detectable. Instead, they are a kind of rhetorical confection, like a centre of gravity: convenient for solving physics problems, though corresponding to nothing tangible — the centre of gravity of a hoop is a point in thin air. People constantly tell themselves stories to make sense of their world, and they feature in those stories as a character; that convenient but fictional character is the self.
A neat demonstration, if one were needed, of the limitations of language: if the air really were as thin as “thin air,” heavier-than-air flying machines could not fly. And the sound of the human voice could not travel from mouth to ear.
Sal Scilicet (subtext: Dico, ergo sum) sometimes sees himself as a rabid iconoclast, at other times as an instinctive Pyrrhonist. This changes with the weather. Sal holds a degree in linguistics (1983) and another in Social Work (1992). He is familiar with four languages and a number of derivative dialects. Physically and ironically, he lives in Australia; emotionally and ideologically, he most emphatically does not.