On the philosophy of language — Part II

by Sal Scilicet

Dictionaries clearly do not define reality and most certainly do not reveal how words are used on the street. We do not live within the neat and orderly context of a dictionary. Life is infinitely more complicated and messy. We all use and misuse words at random, as the irrational inclination of the moment dictates. We so often seem to agree about the unimportant things (by ignoring the detail) that we assume, and confidently expect, all our words ought to make sense.

Knowing all the while, from bitter experience, that this is patently absurd. If language really were such a reliable tool, universally used to say, always and unerringly, exactly what we mean, there would have been no need for all those shamans, theologians, philosophers, lawyers and politicians, to whom the truth is an illusion, anathema to their raison d’être.

Contrary to popular mythology, language most certainly does not ‘work,’ in the sense that it cannot fulfil our naïve expectations. That said, I am well aware of the brazen impertinence of relying on language to say so. Nevertheless, it is my sincere intention to demonstrate — by reminding you of what you already know — that these words, like any other words, do not, because they cannot, convey meaning. Whatever sense you make of this text is going to be all your own work, I’m afraid. I cannot be held responsible for your perception, nor you for mine. Besides which, by tomorrow or next year I will have quite forgotten what I might have meant by what I write today.

Language is a dumb tool. Like a hammer, an eggbeater or an army. Tools lie still and do nothing until you or I ‘make use’ of them. We use language to ‘make sense.’ The human brain has evolved to quickly learn to make sense of sensory data, including spoken language, tirelessly ‘joining the dots,’ associating sights with sounds and tastes and smells, touch and emotional responses. This is only possible in close conjunction with a vast, rapidly expanding, readily accessible library, or database we call ‘memory.’

But language clearly does not allow us to say whatever we like. For example, I believe each person is unique. There never was a person like you or me, nor will be again. Each human brain meticulously lays down its own referential database (memory) according to the sensory data obtained from highly specific, never to be repeated private experiences. But I literally cannot sensibly say, like the crowd in Monty Python’s ‘Life of Brian,’ “we are all individuals,” without evoking a glaring oxymoron.

We are all unique, but we are not born that way. A newborn infant has no concept of being a discrete individual. When we first learned to speak, we mimicked our elders by referring to what would eventually metamorphose into the elaborate linguistic construct of the ‘self’ by slavish recitation, as in, “Billy come too?” Once the child has mastered the first person pronoun “I,” there is no looking back. Thereafter we are obliged to adopt what Michel Foucault liked to call the “enunciative modality” of the “I” discursive position of speech. From there we are irretrievably caught within the rigid protocol of a stage performance, in which each narrative dialogue concerns only three characters.

The first and most important character in any narrative is the first person “I.” I am the only one who can speak. The evocation of “I” builds into any text an unmistakable sense of ‘self,’ ‘individual personality,’ ‘mind,’ ‘consciousness,’ ‘awareness,’ ‘moral authenticity and liability,’ and, inevitably, ‘the soul.’ These are all inescapably derived from the mere act of saying “I am.” The world and the known universe, the whole box and dice revolves literally and therefore most significantly around the first person (Adam). That is to say, anyone who says “I am.” We each must adopt this unique persona in order to say anything at all.

That is how language restricts what you can and what you cannot say. The enunciation of the first person “I” immediately and inescapably evokes the second person “you” as the person spoken to. “I” may even speak to my “self,” as in, “you idiot!” No matter what the social situation, the speaker has no option but to speak as “I” addressing “you.” The only other character on stage is the third person (s/he, they or it), the person spoken about, who remains silent throughout. Each of these three roles is written and performed in the singular. When public figures address crowds, “you” is always understood as a singular entity.

From which we get the rather unnerving fiction that a corporation can actually be possessed of a singular ‘mind’ and therefore one, single-minded, moral liability. Which is never seen on the street. A crowd never behaves as a coherent, morally competent individual, because crowds are not so like-minded; it behaves as an instinctively driven herd. To which every avid sports fan pays undying fealty, only to vehemently disown any socially unacceptable mob behavior in the news.

The grammatically plural, first person pronoun ‘we’ is always singularly treated as a cohesive, morally coherent collective, that can only ever be understood as nominative of an allegedly unified ‘corporate body.’ Which certainly lives quite comfortably within the logical narrative of every text, but is also never seen on the street. To say, “we are all individuals” is just one of the many paradoxes that lurk within the strict limitations of language, governed as it is by the inflexible rules of grammar and syntax. To be sure, without that rigidly reliable structure we could not say anything intelligible at all. For a semantic code to work as efficiently as ours does, both sender and receiver must be able to rely on one agreed protocol.

Which means I can never simply say what I want to say. There are far too many unavoidable rhetorical land mines inherent in a necessarily limited vocabulary. While there can be no question that “you” and “I” appear to have a lot in common, each is nevertheless undeniably unique. No two people have ever shared the same cradle, precisely the same upbringing, education or life experience. Each cerebral database accumulates those memories pertaining only to the experiences of the one person wholly dependent upon that brain, “you” in your small corner and “I” in mine.

It is frequently claimed that, as one can buy a horse or a house, negotiate a financial transaction, make a logical proposal of marriage, or govern a theoretically constituted polity, declare war and make peace, the unquestioned efficacy of all such common instances of human intercourse proves beyond all reasonable doubt that it is clearly not only eminently feasible but indeed absolutely imperative to convey all manner of conceptual ideas by means of spoken and written language. And what a classic fallacy it is.

Surely, universal subscription to this simplistic premise does, for all its popular currency, a grave injustice to the nascent creativity, natural versatility, and innate individuality of each human brain. Everybody knows that no two people, no matter what their renowned eloquence and linguistic competence, have ever thought alike. Let alone come to the same conclusions via identical reasoning.

All knowledge is, after all, language-based. Quite impossible, obviously, to know anything at all unless one is apprised of the appropriate, currently stereotyped words. That being so, and given that every language is a strictly ordered, grammatically determined, rigorously structured semantic system of conventional clichés, parochial metaphors and popular aphorisms, this inescapable characteristic common to all such commercially honored, pedestrian codes has the endearing and thoroughly beguiling appearance of politically convenient, rationally coherent and indisputable clarity of ‘communication.’ Which we know is nevertheless a historically persistent, perniciously pervasive and therefore admittedly quite indispensable illusion. And, as we also know, no illusion has ever failed to breed every pertinacious delusion.

This unshakable mythical ruse relies absolutely on a universally accepted and strictly observed, deeply embedded discipline that continuously combines countless, unwritten socio-culturally significant rules of syntax with ever-shifting lexical fashions of language practice. All such comfortably familiar, suitably communal and pragmatically consensual protocols invariably rely in turn on innumerable, imaginative and legendary, if quite baseless, grand assumptions and immutably traditional superstitions. The resulting acutely embarrassing, notoriously ambiguous structure is what we nevertheless proudly and publicly, though never privately, persist in calling ‘civilization.’ And what a despicably ramshackle house of cards that monstrosity has turned out to be.

All of it is made possible by ignoring, or more generally dismissing out of hand, any oblique or explicit allusion to the inherently unique Point Of View of each participant. No two persons being alike, each of us has, from conception and birth, arrived where we find ourselves today entirely and necessarily by our own, intensely private and highly improbable circuitous route. Each colorful narrative is a work in progress, derived from one’s own idiosyncratic and inimitable memory, upon which we must each implicitly rely if we are to make any sense of all we see and hear. And the sense we make depends on three essential ingredients: the pretext, the context and the subtext.

Each private biography lends relevance to the essential pretext each reader/listener brings to any human interaction, formal and informal presentation, gifts, music, dance, gesture, sign and facial expression. Without our readily accessible memory and fertile imagination, we could literally not make sense of any sensory data. But whatever sense we do make is always only ever going to be entirely my own work.

Which in turn depends on the immensely complex context surrounding each and every unique, never-to-be-repeated situation, every time. The context consists, amongst a bewildering myriad of details, of all the circumstantial evidence that we implicitly depend upon, without which we quite simply could not make sense of the raw data of any spoken and written text, no matter how solemn or flippant.

The context includes written style, vocabulary, tone of voice, inflection, accent, facial expression, gesture, background and ambient sound, risk factors, socially in/appropriate behavior, traffic, weather and simultaneous activity such as eating and drinking, buying and selling, walking and dancing and so forth.

The context also involves things like time of day, mood, blood sugar, state of health, personal agenda, financial circumstances, romantic and/or sexual ambitions and, remarkably, your very own centre of gravity. The tiny fluid-filled organs of your inner ear allow the brain to determine which way is down, toward the centre of the Earth. Without that faculty, we would not know who we are, or where we are.

Our tireless brain is not concerned with “what’s out there.” The brain does not even know it is a brain. Does not care, in fact, if you live or die. Any more than your heart, lungs, kidneys and liver have any vested interest in your welfare. That’s not what your bodily organs are for. Your entire physical organism responds to stimuli according to its genetically determined functions. That’s all. All our emotional baggage and moral imperatives are rhetorical devices. All that so-called ‘psycho-babble’ of ‘self awareness’ has everything to do with language-based ‘consciousness,’ which develops slowly over time.

While your brain does not care where you are, or who you are, it does need a reliable fix on which way is up. Otherwise we could not sit, walk, talk and chew gum. Without this simple, gravity-dependent inner ear mechanism (or when it’s irreparably damaged), we literally cannot think straight and certainly not make any rational sense of all our sensory input.

But whatever sense we do make is entirely our own and nothing like the sense anybody else makes of what looks like the same data arising in what is carelessly assumed to be the same situation. No two distinct things or events, being by definition wholly separate and discrete, can ever be exactly the same.

And then there’s the subtext. This is where we ultimately and finally obtain our enduring, presumably reliable sense of the meaning of all we see and hear. That is, the intuitive, apparently effortless translation and interpretation by each individual brain of every sight and sound detected ‘between the lines,’ all the really important stuff that is finally understood but that was patently neither literally nor explicitly “there.”

The only possible meaning, unique to each participant in what we call ‘human communication,’ is entirely and essentially deduced, with reference to all the previous experiential data instantly and imperceptibly recalled from memory and individually illuminated by our imagination. Therefore, notwithstanding what it feels like, meaning does not reside in the text, but is the exclusive individual sense that each of us is obliged to make, according to each unique whole-of-life experience.

And to make matters really interesting, we can never be sure who anybody is. We can never know the true identity of the ostensible writer of any text, book, article, lecture, speech or sermon. “I myself” don’t even know who “I” am supposed to be, whenever “I” say “I am.” Nor should we care. “All the world’s a stage, and all the men and women merely players.” We are always playing a scripted part. All we have to work with is what we see and hear. What we read and write. What “you” say to “me.” And what “I” say to “you.” We can never really know any more than what we read and what we are told. Which is never going to be the truth. Certainly not the whole truth. In fact, everything we read and hear is anything but the truth.

What we really mean by what we say and write, and what we really understand from what we read and are told, must remain forever moot. Our intensely private experience of everything we perceive (we know not how, and care less), at least what we can remember and subsequently keep redefining, is, by definition, inaccessible to conventional language. We have little choice but to believe implicitly that words always tell the whole story. When in fact, we know the real substance and significance of true understanding, on which the sum of our very existential realities utterly depends, is quite beyond the scope of all the politically correct, morally justifiable clichés upon which we are nevertheless thoroughly, socio-culturally habituated, and therefore irrevocably obliged, to rely. The paradox inherent in all language is universally conventional.

So we keep talking, more than listening, reading and writing. Carelessly but selectively. Blotting out anything we can’t quite accommodate and forgetting to mention what doesn’t quite fit our coherent narrative. Pretending all along that “what I say” at least makes perfect sense to me. And so it should to “you.” Even though we can never quite escape the persistent, unsettling uncertainty as to what the “other” really meant by what s/he said and/or wrote. Every time. Hypocritical? Sure. Welcome to the family.

It seems to me that our experience of so-called ‘contemplation’ and ‘thought’ is nothing more or less than the product of the brain’s genetically determined predisposition for generating language, in endlessly random loops of partly remembered dialogues and bits of passages from books and discussions. This constant buzz of linguistic recitation, which may be what we remember as ‘dreaming’ and idle musing, in which the “I” takes centre stage, may well be what we associate with our much-lauded, yet elusive ‘consciousness,’ our precious ‘sense of self’ and that most beguiling of human qualities, ‘my personality.’

So the brain keeps talking, as though “I” am talking to “me,” there being nothing quite as reassuring as the sound of our own voice. As all desperately lonely and isolated people know intimately well. There is often, not always, real therapeutic value in the habitual practice of keeping a diary. Writing down what is randomly going on “upstairs,” where the action is. Because, as we like to keep comforting our “selves,” consciousness is where “I” live. My sense of “self,” we keep repeating, never sleeps. Even as we know, beyond the shadow of a doubt, that it ain’t so. As is so typically paradoxical of the inflexible semantic structure of language, so-called “self-denial” can, by definition, have nothing whatsoever to do with “me.” Obviously, if “I” am not there to say so, how am “I” to deny my “self”? Cogito? No. Dico, ergo sum.

Daniel Dennett has a deflationary theory of the “self.” To him, selves are not physically detectable. Instead, they are a kind of rhetorical confection, like a centre of gravity, convenient as a way of solving physics problems, although they need not correspond to anything tangible — the centre of gravity of a hoop is a point in thin air. People constantly tell themselves stories to make sense of their world, and they feature in the stories as a character, and that convenient but fictional character is the self.

A neat demonstration, if one were needed, of the limitations of language: if the air really were as thin as “thin air,” heavier-than-air flying machines could not fly. And the sound of the human voice could not travel from mouth to ear.

_____

Sal Scilicet (subtext: Dico, ergo sum) sometimes sees himself as a rabid iconoclast. Then an instinctive Pyrrhonist. This changes with the weather. Sal holds a degree in linguistics (1983) and another in Social Work (1992). He is familiar with four languages and a number of derivative dialects. Physically and ironically, he lives in Australia; emotionally and ideologically, he most emphatically does not.


107 thoughts on “On the philosophy of language — Part II”

  1. And now for the denouement.

    Ladies and gentlemen. This was not a hoax. [Do not adjust your set.] It certainly wasn’t much fun either. In fact, it was bloody hard work. My only consolation is that I believe I have demonstrated, admirably well, if I do say so, how tenuous our much-vaunted linguistic sophistication and subsequent hold on a more or less serviceable consensus on the true nature of “reality” really is.

    I believe what we are witnessing today is a relentless breakdown in the vast array of public assumptions that seem, in living memory, to have served us so well. The medieval certainties will not sustain us now. Not only in the United States, but world-wide. Not long ago, it seems, “love and marriage went together like a horse and carriage”. Same-sex marriage was not only literally unthinkable, but a denial of some law of nature or other. God was in his heaven and all was right with the world.

    What has happened, I believe, is the Internet. Twenty years ago, the remarkable efficiency with which the overwhelming flood of viscerally sincere discussions now takes place all over the ‘blogosphere’ was not possible. We had time to think. Let things sink in. Allow the seeds of dissent to germinate. And in the fullness of time we might deliver our eminently sensible intellect of a measured, civilised response. Those genteel [“good old”?] days, for better or for worse, are gone for good.

    The human brain, while prodigious in its output, has severe limitations. Which, I suggest, is just as well. The way electronic devices manage to “talk to each other”, at the speed of light over quite irrelevant distances, would render perfectly catatonic even the brain of an Einstein. [He it was who begged his students to explain things to him slowly.] The average blog thread enjoys a half life of barely a week. By the time a thoroughly decent, thinking person has had time to compose a reasoned response, the noisy caravan has rolled on to the Next Big Thing.

    That said. As evidence of how our benighted ‘human communication’ has broken down, I would like to submit a verbatim digest below of the email traffic that has recently transpired between Massimo and one pugnacious “Theo Wit”. [My God! Isn’t that his real name?]

    Massimo was roundly taken in by my essay. He saw sufficient merit in it to publish it on his website. For which I am deeply indebted to him. But, when “the shit hit the fan”, he predictably ducked for cover. As we all do. The self preservation imperative is written indelibly in our genes. It’s what got us all the way to here, as I said. The emails reprinted below, tell the story.

    Ladies and gentlemen of the jury. You have been unwitting players in a deliberate, elaborate conceit. [As distinct from a hoax, which bears malicious connotations. This was a simple ruse. A device for demonstrating something most elementary, which, it was hoped, the participants are bound to recognise from personal experience. Hence my allusion to “reminding you of what you already know”.]

    This method, I might add, is far from unusual. Old as language itself, in fact. Because all language everywhere lends itself so eminently well to simple deception by means of “double-speak”. By virtue of the very linguistic competence we so pride ourselves in, we are all daily dragged into all sorts of contrived scenarios, which we are obliged to take seriously. Lest we appear “incompetent”. A popular put-down.

    Let me explain. “Theo Wit” is a play on “God’s jester”. “Sal Scilicet” is Latin. Sal: common salt, as in the salient thrust of the argument. Scilicet: “to wit”, or “that is to say”. “Dico, ergo sum” roughly translates as, “I comment, therefore I am”. I don’t know who the face in my avatar belongs to, but there was a loud hint hidden in the warning, “it’s obviously not Marilyn Monroe”.

    My text was deliberately packed full of hints that something was going on, other than what the author seemed to be saying. With such evident passion. My opening line referred to a bar-room brawl. Which was reflected in the reference to ‘Blazing Saddles’. And so on. The deliberately oblique clues are too numerous to mention all of them here. The formula is well-known everywhere: say something provocative and watch “the usual suspects” jump.

    Three millennia ago, the Etruscans invented the idea of “the persona”. They devised the Etruscan mask to enable an actor to “represent” a character on stage. The identity of the actor was irrelevant. A thousand years later, the Romans borrowed this concept and made it their own. The Greeks wisely treated all theatre as subversive. Aristotle turned the Etruscan idea of “persona” into “catharsis”: the idea that we allow the actor to “represent” us. [When Arnie finally dies, we die with him.]

    As the lights dim in the cinema, we willingly allow ourselves to be transported into a dream state. As we become completely absorbed by the action on screen, we are blissfully unaware of our surroundings. We emerge from the darkness into the glare and bustle on the street, as from a hypnotic state, our brain needing time to readjust our moral GPS.

    Bertolt Brecht chose to emphasise this phenomenon by deliberately letting the audience in on the ruse. This has been repeatedly emulated to hilarious effect by the likes of Mel Brooks. I mentioned the barroom brawl in my opening line. [The little old lady being beaten up in ‘Blazing Saddles’, turning to the camera, says, “have you ever seen such cruelty?” And the guy being dragged by his horse: “that’s the end of that suit”.] Bob Hope and Bing Crosby employed the same technique in their road movies. [As Bing embraced the damsel in distress, Bob would pop his head into frame and say, “Now’s the time to get your pop-corn kids”.]


  2. And we fall for it every time. We all remember, how, as children, the line, “Once upon a time …” was irresistible. [It still is.] Not as a warning that what you are about to hear is pure bunk. But as an invitation to enter Fantasy Land, where John Wayne actually gets the girl and they live happily ever after. [The Man Who Shot Liberty Valance.]

    My text was absolutely littered with subtle and not so subtle hints. Alas, thanks to the immediacy of the Internet, I’m afraid the art of subtlety and nuance is sinking fast, even as we speak. The strict grammatical structure of language naturally lends itself to deception. It’s so easy, as we all know, to lead your audience astray, that we ought to be aware of what’s going on. That was my subtext. Nevertheless, it took me many hours to work my double-speak into some credible semblance of polished prose.

    My text repeatedly reminded readers that, “on the Internet, nobody knows you’re [not] a dog”. Then came the “Usual Suspects”. That classic, Kevin Spacey vehicle, in which nothing is as it seems. [Pete Postlethwaite’s craggy face was perfect for the double act.]

    Shakespeare used this device with inimitable skill. He understood that his unsophisticated audiences had no trouble reading between the lines. He often used the innocent court jester [Lear’s fool; the porter in Macbeth] to speak what the protagonists could not.

    I made references to the ancient Bible writers and the ‘Pesher Method’, hiding intended meaning inside a simple narrative. Moses and the burning bush, in which comes the instruction, “tell Pharaoh that ‘I AM’ sent you”. In Hebrew, the present tense of the verb to be is inherently [possibly deliberately] ambiguous. “I am” can also be read as “I will be”.

    Then came the reference to Michael Crichton: “Whenever you hear the consensus of scientists agrees on something or other, reach for your wallet, because you’re being had.” [Send not to know for whom the bell tolls …]

    How many television dramas have poked us in the eyes with this obvious fact? [‘House of Cards’; ‘Yes, Minister’.] What you see is not what you get. President Obama not only never says what he means, he dare not. No politician worth his or her salt can afford to assume that my words will convey what I want you to understand. The words simply cannot be trusted. So we tell a story … [Thereby hangs a tale.]

    Not only that. As the by-now iconic, mathematically inspired artwork of Maurits Cornelis Escher clearly demonstrates, not only will your words betray you, but your eyes will deceive you. And still we cry, “I saw it with me own eyes!”

    There is always a subtext. The public discourse, as transacted on the Internet, most loudly represented by the American college-grad ‘chattering classes’, simply cannot pause over the subtext. This is not new. Look at Christianity. Two thousand years of tradition, based entirely on a logical absurdity: the Crucifixion was both a good thing and a bad thing. Dietrich Bonhoeffer famously agonised over whether the proposed assassination of Hitler could be morally justified.

    The grammatical inflexibility of language renders us incapable of dealing with relativism. We can just never quite seem to get our head around the idea that sometimes murder is actually a good thing. That sometimes, for the love of God, it is absolutely essential to tell a lie. Stories of the European underground during the war are replete with such potentially paradoxical, moral dilemmas.

    In many of the loudest quarters, that simply will not wash. Today, all the most urgent and pressing social issues have been reduced to the simplistic, black and white rhetoric of a lynch mob. The Left cannot abide the Right. Democrat is damned by Republican. And vice versa. Pro-life is reviled by Pro-choice. The Calvinists and Baptists define themselves by treating the pope as the anti-Christ. And don’t get me started on the ‘gender wars’.

    The phrase ‘equal opportunity’ sounds genuine. Take the time to analyse it, and you inevitably find it’s pure rhetorical subversion. That male is equal to female and black is the same as white. That money will not make you happy. That Israel cannot occupy Palestine because there never was such a place. These are all eminently useful devices to create the illusion that there is a universal truth out there. A pervasive, wilful blindness to the pretext, context and subtext, on which we all depend to make sense.

    For what it’s worth, my underlying subtext was, “No Virginia, the truth is not out there. Be afraid, be very afraid”.


  3. August 12, 2014. 2:44:23 pm
    Massimo,

    I don’t know where to place this. So I’ll give it to you straight. On the understanding that I hope you might appreciate on-the-level feedback from the boondocks, on how your website is perceived in some quarters.

    You may publish this, or not, as and where you see fit.

    I have to say, having barely survived my disingenuous debut on your website, I cannot escape a bitter pervading after-taste. There is a distinct impression hanging in the air that I was set up. That I was naively induced to play the stereotypical fall-guy in a regular Abbott and Costello set piece.

    Have you ever been to a party where you find yourself standing with a group of guests who are talking about you in the third person? No? How do you think that feels?

    I hasten to admit that, yes, many commenters were quite effusive in their praise of my piece. As I have already intimated to you and elsewhere, due to chronic ill health, I don’t get out much. That is not intended as an excuse. Merely to correct some preconceptions out there. I need desperately to believe that any respectfully appreciative comments were genuine in their sentiment.

    But how many more, shall we say, ‘confident’ respondents addressed their comments, not to the author, but directly at you? “Massimo, is that you?” How would you feel, if complete strangers discuss the likelihood that you are the product of a conspiracy? The mere suggestion that my heartfelt submission was probably a hoax elicited alarming support.

    Whatever happened to the legendary American hospitality and academic civility? Or is this just part of the equally mythical OINYKOINY ethos so frequently and proudly identified with the abrasive intelligentsia of the Big Apple? I am uncomfortably familiar with the Academy Award Roasts and the fundamentally irreverent Seinfeld phenomenon.

    No doubt I’m being too precious. But where I come from, this is not just tantamount to poor etiquette. This is nothing short of blatant ostracism, based on that good old stand-by, xenophobia. What the farmer does not recognise, as the well-known German aphorism goes, he will not eat.

    You did deny that it was a hoax. But there we see the paradox inherent in any language. For me, your defence only confirmed and doubly reinforced the inescapable feeling that I’m being talked about: don’t worry folks, this guy is not to be taken seriously. Nothing more than exhibit A in a star chamber interrogation. For our amusement.

    Such perceived (no doubt misconstrued) cattle dealing evokes rising irritation, that is then understandably reflected in the victim’s responses. Which are then predictably and roundly condemned as pathetic “emotional hyperbole”. “We are not doctrinaire sharks!” How dare you! The collective “we” is a dead give-away. “Our integrity” is being challenged! The unmistakable, institutionalised closing of ranks, rallying to the flag. The breach must be stopped at all cost.

    BTW. Your website does not work very well. I’m afraid you confused your readers by inserting both “by SciSal” and “by Theo Wit” under the title.

    The title itself was promptly called out. On the grounds that the Philosophy of Language is a recognised discipline, where the uninitiated have no right to intrude. Subtext: I’m not permitted to discuss the Second World War, because that topic has already been thoroughly done to death by experts.

    My avatar was flippantly dismissed as, “obviously pretending to be Daniel Dennett”. [A man I hope never to meet in a dark alley.] Sal Scilicet was deemed a play on Scientia Salon. How paranoid is that?

    I should have made it clear that I was expressing a layman’s opinion, not laying down the law. In Europe that would be taken for granted. But in the US, public discourse has become hysterically polarised to the point of gridlock. To express doubt that our contribution to inevitable climate change is likely to make much difference is seamlessly conflated with “Holocaust denial”. I speak Hebrew. That means I’m a rabid Zionist. [That I’m not Jewish is, I agree, completely irrelevant.]

    I should have cited chapter and verse of all the relevant literature. In Europe, students are encouraged to think for themselves. Original ideas are more readily appreciated, not automatically dismissed as obviously plagiarised incompetence.

    Every time I want to post a comment, as you urged me to do, I must log out and log in again. Then I’m still often told, “sorry, you must be logged in to comment”. Then I have to back out, shut down, and start again.

    The emails, advising me of new comments, do not work. When I click on the associated link, I’m invariably directed to your website, not to the specific comment. Then, after I have laboriously found that comment, there is often no ‘Reply’ button. This becomes so frustrating that paranoia is just around the corner: have I been black-listed? Indeed, other commenters have said they could not reply to mine.

    My interest is textual analysis. Ergo, my reading of what transpired is based on intense scrutiny of every nuance in the comments. This, I think I was arguing, is precisely where most misinterpretations occur. Ordinary daily discourse is never subjected to such close scrutiny. For the sake of going along to get along, we are more than willing to make allowances for awkward expressions. ‘Pregnant pauses’ are quickly patched with inane pleasantries.

    We are more than willing, among friends, to give each other the benefit of the doubt. Which inevitably gives rise to the ubiquitous unwritten conspiratorial agreement: consensus. My word, but we understand each other very well, don’t we. More tea?

    Theo Wit.

  4. August 12, 2014. 10:09:51 pm

    Theo,

    well, I’m definitely sorry you feel this way, and if you wish to publish this as a final comment, please do. But no, there was no set up on my part, and my readers reacted the way they reacted for a variety of reasons to which I’m not privy. 

    Some of what you wrote was indeed egregious from the point of view of philosophy of language, and they called you on it. I did filter the most aggressive comments, but after a while it was clear that people were beginning to be upset with me, as the editor of the site, for having published the piece to begin with!

    As for the technical side, I checked, and there is no “SciSal” before your name on the pieces. I don’t know why you had trouble with the comments; the thing works fine on my side. As for the “reply” button, there are two levels of responses: once someone responds to another comment, all further responses are treated as replies to the original one. Believe me, it’s the least confusing way to organize complex threads.

    cheers,
    Massimo

    August 13, 2014. 11:16:17 am.

    Thanks Massimo,

    I am attaching two screen shots of the emails advising me that my essay had been published.

    Under the title is “by SciSal”.

    This is not reflected on your website today.
    But it was there when you first published.

    Cheers,
    Ted.

    August 13, 2014. 11:34:31 am.

    That’s odd. There is a setting for not having the double authorship come out, but I switched it several weeks ago. At any rate, the problem doesn’t seem to appear now.

    Cheers,
    Massimo

    August 13, 2014. 11:43:16 am.

    You see. There’s my problem right there.

    No apology. Just, “that’s odd”.
    No acknowledgement that now you see why commenters were appealing to you. “Massimo, is that you?”

    No acknowledgement that now you see how that might induce one such as I to wonder whether this is some kind of set up.

    No mea culpa. Instead of that, you admit that, on the basis of what your regulars have said, with some venom, you now realise that perhaps it was a mistake to publish my work.

    Where does that leave me?

    You took me on. Now you hang me out to dry.
    Where’s the academic ethics in all of this?

    Ted.

    August 13, 2014. 11:48:31 am.

    What on earth are you talking about? As I said, that problem was fixed on the site weeks before your essay was published. I don’t know what email alerts look like, since I don’t get them.

    I have nothing to apologize for. I published your essay in good faith, and I am not in the habit of defending my guest writers from my readers, except for the occasional filtering of really poorly phrased comments.

    Sorry this was such a bad experience for you, but I don’t see what I’ve done wrong or could have done otherwise.

    -M

  5. Sal Sci (Aug.11th) It was my pleasure.

    On my reading, you put the key to PofL I & II in the second paragraph of Part I.

    “I would like to intervene in the many feisty, on-going and by-now tediously predictable semantic jousts in this-here genteel forum, in particular between Massimo and some of his regular readers..”

    From there on it only *inferred* much sense, admittedly in florid, axe-grinding, abrasive, excessive and self-contradictory style, and probably because of that was considered post-worthy.

    My brief version on the subject:
    Humans are just one of a host of species to have evolved intelligible communication, by which I mean a capacity to transfer useful knowledge from brain to brain by sensory exchanges (by signs, sounds, expressions, body language, etc.).
    Our ‘Language’ ability is, of them all, outstandingly accurate and precise, especially in its many permanent external recorded forms; but it is, by its very complexity, wide open to confusion and misuse, accidental or deliberate.
    Attractive, seductive ‘Falsity’ abounds. Yet hard-won ‘Truth’ can never be more than belief, ranging from highly probable to highly doubtful, always founded on pragmatic induction arrived at from sensory input. (Yes, I include 1+1=2 and the speed of light, for instance!)
    Private ‘meaning’ each of us idiosyncratically deduces using previous experiential data recalled from a unique memory, which comprises two parts:
    (1) personal ‘hard-wired’ inherited data evolved from our ancestors’ experiences (this includes reflexes, instincts, emotions, intuitions, feelings …)
    (2) personal ‘learned’ data (including our degree of language fluency)
    and then illuminated by a unique imagination.

    Much less inflammatory …isn’t it?

  6. I have allowed the publication of these last two comments, which include private emails I sent Theo, but I find it peculiar that he wanted them published, and even more so that he did not ask permission to do so. Still, I stand by what I said: I was not “taken in” by anything, and I certainly didn’t “duck” a thing either.

    I evaluate each submission to Scientia Salon on what I think are its merits, within my limitations in terms of expertise, particularly in fields far from my own specialties of biology and philosophy of science. But it has always been my policy that once published, the authors are on their own in terms of whether, how, and how much they want to engage my readers. I occasionally chime in on discussion threads that are not about my own writings, but this is a part-time, entirely free enterprise, and I do have a daily job (fortunately, a pretty rewarding one, at that!).

    I still think that Theo’s essay was worth publishing (even though I seriously disagree with his main thesis), and that the discussion – though at times testy – was also enlightening in a number of ways, at least for me. Cheers.

  7. Theo,

    I enjoyed your posts.

    According to my reading and in the context of many of the comments, here are some of my thoughts:

    If I say “I would like a glass of water” I assume things like “I want it as soon as reasonably possible”, I want it to be anywhere from 75 to 95% full, and I’m expecting the glass will be about 250 to 400 milliliters. All of those values are also interdependent to a certain degree, e.g. I expect a smaller glass to be fuller than a larger glass. So the meaning of “I would like a glass of water” is imprecise. This is fully acceptable, an inherent facet of language, and as mentioned can be a ‘trivial’ observation. As long as the situation is relatively straightforward and our contextual referents, from person to person, are reasonably the same, we navigate meaning with ease and communicate efficiently.

    But as we move away from more simple statements like “my cat is black” towards more complex statements (like “consciousness is a function of the brain” or “the regimes that are part of the axis of evil are to blame”) the variability of meaning for each word and for the overall statement start to be factors that can significantly contribute to various levels of misunderstanding and conflictual exchange (from simple disagreements to all out war).

    In other words, thorough and clear argumentation and definitions can become very important, depending on the work we expect language to perform. Saying “The cat that I consider to be legally and emotionally mine (but at the same time do not consider my property) has fur that can be said to be mostly the color html #000102 when viewed under a light source with a color temperature of around 6000 kelvin” does not help us communicate effectively when the task at hand is simply to convey a general idea of the color of my cat. But higher levels of precision are, in general, not overdoing it when speaking of things like metaphysics or conflictual relationships.

    Margins of error add up. The simplifications and shortcuts we use in day-to-day conversation can lead to serious misdirection when applied to more argumentative or more complex situations. For example, in the exposition and discussion of the factors involved in, and reasons behind, a particular armed conflict, when the simplification or pruning of facts takes center stage (and there are relatively large differences between individuals’ experience of the issues involved), the probability of erroneous beliefs and conclusions, or of actions that escalate, or at best perpetuate, the conflict goes up.

    In short, I think you touched upon a lot of important issues. I often feel that when there appears to be a strong drive towards oversimplification, exaggeration, lack of concern or dismissal, insistence on a certain kind of treatment, or denial that there is even something to talk about in the first place, it can be an indicator that the issues involved, for lack of a better term, may be part of or bordering on a ‘cultural blind spot’, and so merit our full attention; their unravelling could be particularly productive.

    So I think we can only profit from paying more attention to what I understand to be some of your concerns and I’m glad Massimo accepted your contribution.

Comments are closed.