Habilitated-Beliefs #2

One usually becomes a member of a group which is empowered to give expression to an idea in a quasi-official manner through happenstance, not by making carefully considered, deliberate choices. Stating or declaring that one is, for example, a physicist does not entitle one to declare what statements about the world, particularly what explanations of its phenomena, are to be taken as authentic!

The emphasis is on “authentic”, which requires the explicit consent of others. It is often difficult to attain. There are exceptions, as when someone converts from one religious group to another, which involves an overt switch between highly differentiated social groups — each of which may hold strong and competing views about specific areas of a common world.

At present the health field is a good example of this. Here disagreements about the curative effects of herbs and pharmaceuticals have become quite strident, yet the data supporting each belief-base are not equally firm. At least, so it seems. An equally contentious area is the role and the effects of therapeutic procedures on the so-called mental health of individuals. The old debates about the effectiveness of psycho-therapeutic procedures versus those that solely involve “physical interventions” continue.

How are such conflicts to be resolved? There is an end point, namely, whether a patient has improved in accordance with agreed standards. A similar issue concerns the truth about global warming, what to do about such trends, and what their likely long-term effects are. There have been major disagreements within the ranks of scientific pundits about what conclusions can be reached on the basis of current evidence (and methods of inquiry) and what would tilt the case in one direction or another. These issues are not resolved by a popular vote, nor by employing questionable and contentious polling procedures (such as opinion-polls based on a representative sample of all graduates holding a first-level science degree!). Issues need first to be resolved at the most basic level within the previously mentioned empowered group, the existing group of acknowledged experts in the field.

This is where the concept of *habilitation* becomes useful. Habilitation specifically refers to a process of generating agreement among people who accept each other as qualified to express opinions about the status of specific issues. It does not mean that this is sufficient to elevate any conclusion reached to the status of the indisputable, but it explicitly states that such conclusions are the best available at the present time. It may be necessary to state under what conditions such conclusions could be challenged or reversed.

In short, any issue or matter habilitated can also be challenged in future, and therefore it could ultimately be de-habilitated. This requires several steps. What does not follow is that whenever a position is de-habilitated all former positions automatically return to the status from which they came, or that such a change in status automatically re-adjusts other positions. There is no predictable radiating ripple effect, but there usually follows a series of adjustments in several cognate areas whereby all matters become viewable as part of a compatible world: incompatibilities need to be fine-tuned! Compatibility furthermore means that in the final analysis contradictions have been eliminated.

Restricted Definitions

Generally speaking, we refer to definitions when we wish to assign a particular, unique meaning or interpretation to a word. However, words appear in sentences and their meaning is conveyed in three ways: (1) by the definition, as given in one of the many dictionaries available to us; definitions usually vary from dictionary to dictionary; (2) by the specified context in which the target word appears; (3) by the general context in which it is used, the unarticulated situation.

The difference between (2) and (3) is straightforward: take for example the word *animal*. If one is discussing animals in a zoo — a specific context — most creatures not on display in cages or enclosures are excluded. Here the term *animal* has a restricted meaning, so that insects, vermin and fish would be excluded. If one refers to snakes in an aquarium it is safe to assume — but not certain — that the talk is about water-snakes. A general context (3) refers to no specific habitat, but a context is assumed, so that one could be talking about land or water snakes.

This blog contrasts two kinds of definitions: those which equate a term with its definiens, and those which propose areas of use.

A definition is not a description in the normal sense, but clarifies how a word — even a stream of related words — is used in everyday contexts. However, some dictionaries will also include definitions of terms which are not the ones currently circulating, adding a note that such a use of the word is archaic and rare. In doing so, the editor(s) acknowledge that uses change over time and, furthermore, that a word’s relationship to other words and the ideas these represent changes. It appears to be a very fluid situation: I propose to refer to this fluidity as “porousness” and therefore to speak about porous definitions.

In special areas of interest — like Physics — some terms originate in ordinary language (*force* or *attraction*) whereas others were created to put a name to a discovered phenomenon. Contemporary physics has many examples, like *neutrino*, which Wikipedia describes as “a lepton, an elementary particle with half-integer spin, that interacts via only the weak subatomic force and gravity. The mass of the neutrino is tiny compared to other subatomic particles.”

This is not the usual definition found in a dictionary but represents a mixture of a definition and a clarification. It is not at all porous, but firm as steel. There is nothing comparable in our — or any other — language: indeed, the definition of “neutrino” is valid for all languages! It is the prototype of a restricted definition. I cannot imagine how a restricted definition can be used in any but its literal sense, where the definiens exhausts the meaning of the target word. Thus a sentence which contains the phrase “neutrino-like” would be an attempt to expand the restricted definition — to break the bond — but it is difficult to see what this would mean, what feature of a neutrino as originally defined would be extracted and assigned to a different (new) phenomenon.

It is therefore useful to distinguish between terms which can be — or are — given definitions meant to be used in specific cases only, definitions which are deliberately contrived, have limited applications and are therefore restricted, and definitions which are not steadfast in meaning but are what I describe as porous. Included would be words like *bridge*, which can refer to a structure built across a chasm, can also mean a way of connecting two issues, may refer to part of the nose or to a reconciliation between two fractious parties, and is also the name of a card-game played by two partnerships of two players each.

Restricted definitions are widely used in technical fields and by scientists when they refer to their own domain of study. These groups develop an in-house set of terms and expressions which are often incomprehensible to outsiders. This barrier to understanding is not necessarily planned. People learn early that they operate in a multifaceted society where expressions are not only descriptive of a situation but often reflect the mood of a group. If one is not a member of such a group the discussion may pass without ruffling any feathers.

Words may have colour and get chosen to bias a scene. When this occurs, *generally used* is a reference to social customs which are — by definition — restricted to a group; it is therefore a social statistic.

This could be stated in terms of what people of a particular social group usually do with words and sentences in a given, specifiable, set of circumstances. A person who addresses others by “peace be with you” does not necessarily mean what he/she says: they may, in truth, be wishing you dead! In short, words do not necessarily mean what they say.

In live situations one puts two and two together: one listens to the words, identifies them, observes how these are being used in context, and uses other clues to interpret what the speaker really meant. If the other person is “flashing their eyes” and also reaches for their pistol, one is likely to take the greeting “peace be with you” as an aggressive, hostile one: one takes appropriate precautions.

Definitions as listed in a dictionary therefore need to be viewed with skepticism. The listed definition is a guide, not a legislative act. It indicates the possible use of a term, perhaps even a widespread use, but not necessarily the only, exclusive one.

There are many exceptions to this rule: specifically, definitions of terms used in a particular branch of science, technology or a professional sphere, like Canon or Criminal Law, which may be intended for use in an idiosyncratic, limited manner and which arise by common agreement among those using them. Inevitably this use is meant to be exclusive, singular, and is therefore often incorrectly employed by outsiders but also by insiders who should know better!

Words and expressions are produced by men, women and children in specific circumstances, most often willfully, with intent, not haphazardly. Few people adopt the view that a word must mean what they want it to mean. They follow custom (although poets have license to break customs) and when in doubt look up the word or expression in a reputable dictionary or thesaurus.

There are exceptions, primarily among those totally new to a language. Words, we say, have meaning. These are sounds which refer or point to events that may have nothing to do with the quality of the sounds uttered. Thus words, and sentences constructed from words, may be viewed as codes to inform others about states of affairs, and this may apply to the state of the individual or to impersonal, external events.

We furthermore identify whether the information may be trivial, like “you have just stepped on a beetle,” or life-threatening, like “You have just stepped on a boa constrictor.” Expressions or sentences get part of their meaning from the circumstances under which they are uttered, although the circumstances are not part of the definition of the terms. Thus the meaning of a word depends on several circumstances, so that a definition — as it appears in a reputable dictionary — should be viewed as a declaration about how the word could, or may, be used; it is not prescriptive, only suggestive — and discretionary.

Finally, let me comment on the difference between (1) restricted and porous definitions and (2) the notion of rigid designation, an idea we owe to the contemporary philosopher Saul Kripke (b. 1940) (see his Naming and Necessity, Harvard University Press, 1980). The idea of rigid designation has been discussed by several philosophers/logicians in discussions about “possible worlds”, not only the one world claimed actually to exist. The basic assumption generally made throughout history is that there is one real world and that we gain access to it by following strict procedural rules. This assumption found its most ardent expression in the work of Aristotle, who argued that empirical research will reveal the “nature of things,” namely those features of something — an object or event — which are essential to it and those which are only added qualities.

So an object has two sets of features or qualities: those which are indigenous to it, its essences — which cannot be stripped from it without depriving it of its identity — and those that are ornamental, or contingent. As investigators we are therefore assigned two tasks: to identify the essence of each species of things and thereafter to classify to which broader category it belongs.

The Aristotelean approach and its method of searching for knowledge, although strongly and traditionally supported by Christian scholars, was forced into a compromise during the 16th century by technological developments which not only improved observational methods but could result in discoveries incompatible with the picture of the universe developed by previous generations. It could produce a “paradigm shift”, as described by T. Kuhn in the 1960s.

The most stunning example of this took place decades earlier, with Einstein’s relativity theory, and before that when Darwin postulated that life forms on our planet had developed over many millions of years in an orderly manner according to rules many of which had yet to be discovered. Structuralization was not a firm rule, as postulated by Aristotle, but something which happened according to rules yet to be discovered, not a force acting upon nature but part of nature itself.

The end of the 19th century and the first half of the 20th hammered home the idea that we have to relinquish the old notion that *discovery* referred to unearthing treasures, like broken urns lying hidden below the desert sand, and to replace it with the notion that structure itself comes about, emerges and develops in a seemingly endless series, a process. The orientation this produced was the idea that there is not only one world but many, each viewed as the best conjecture of a possible world. Thus a particular term could feature in several possible worlds, but in each case this involved a shift in its meaning.

There was, therefore, a place for rigid definitions of terms, provided these were confined to one possible world. Once defined, such words could not transfer to another world without infringing their earlier rules of use.

Statements, Meaning, and Analogies

Statement A: The river has crested.

Statement B: The river has reached its highest point.

Both statements describe the condition of a river. The two statements appear to say the same thing, and therefore — it is claimed — have the same meaning. Even if it is not true that these are equivalent — in statement A, the river may yet run higher, contrary to what is said in statement B — it may be claimed that A is equivalent to B, and vice versa. It is a question of meaning.

In other words, it may be claimed that statements A and B refer to the same situation. As philosophers and commentators, are we required to resolve this issue? Do we need to ask “what is the difference between reference and meaning”?

I would argue that statements A and B converge in meaning, but I would also emphasize that this does not make them necessarily equivalent.

First, the meaning of each statement depends on the context of its use. Consequently if the context changes, so does the meaning of each statement.

Second, the two statements, when encountered in certain contexts, have a hierarchical relationship to each other. One is more abstract, more inclusive, than the other: this is something which has to be evaluated.

In the second case, statement B is an empirical statement, so that the data are determined in a different sense than in statement A. Both meanings are data-determined, but in different ways. Each says something different. Statement B is related to data in a more “fundamental” way than statement A, which is mostly analogical. This needs to be clarified (see below).

When I refer to data, I suggest of course that regardless of how things were described, we would most likely agree that there is a phenomenon we are all aware of which is what is being talked about, and that this phenomenon can be identified and mensurated in a manner which allows us to determine properties reflected in both A and B.

To say “the river has crested” is clearly analogical. *Crested* derives from French and refers to (among other things) the comb of a rooster. It refers to form rather than to a (linear) measure and only indirectly to how this can be measured. By contrast, B is a reference to some previously adopted scale against which the river is being compared. In this sense, statement B is more fundamental. One could also argue that statement A is more abstract than B: to say that the river has crested is not only analogical but also represents a more general case and is more difficult to gainsay.

Conclusion: an analogy represents an attempt to state a current experience (event) as a case of a more general event. In this sense analogies are theoretical. Thus, to assert that A is like B — the form of an analogy — is to construct a theoretical proposition whose falsification becomes increasingly more difficult the more analogical it is.

Analogies like statement A represent our efforts to increase the stability of our world in the face of experiential diversity, and in that sense analogies have significant heuristic value: they reduce variability, change, diversity. One can question the value of a particular analogy, but this in itself is not a test of an implied theory.

In retrospect, the history of human knowledge suggests that we forever seek analogies in the hope that the one chosen is more appropriate than earlier ones, and that the one chosen now will assist us better in unifying the increasing diversity of our experiences of our world.

Philosophers as Shamans

This is the first of several short pieces on the relation between shamans and philosophers. Each piece deals with an aspect of the theme: how faith-healers differ from philosophers, and how metaphysical thoughts may influence how we perceive our world — that is, what we actually claim to have seen — whether through rational persuasion by others, by changing the framework of perception, and coincidentally by changing our sense of personal well-being and of the body itself. Shamans — male and female — are precursors of priests and organized religion, but are more often viewed as opponents than friends.

How to distinguish Philosophers from Shamans

This strange and challenging title joins two groups: philosophers and shamans. In the popular mind these seem to have very little in common.

Both can be soothsayers, but philosophers traditionally deal in wisdom whereas shamans deal with well-being, the esoteric, the health of body and mind. The general view is — I think — that philosophers are educated, wise, cool in judgement, perhaps a little eccentric and monistic, inclined to press and urge their “take on things”, and often anti-establishmentarian because they challenge some features of what is perceived by others as common sense. After all, the ordinary person does not walk around claiming that “time is not real”, as some philosophers have done, or suggest that all things are made from the same unseeable substance, from atoms, which by definition cannot be “sensed”. These views vary from what is normally accepted. Philosophers, however, have negotiated with others a license to raise issues about obvious matters and are often permitted by their fellow citizens to utter the unutterable, and to speak whereof the ordinary person must remain silent. This is, generally speaking, a benevolent view of philosophers: it views them as gadflies, but not as vicious rampaging mosquitoes.

Lest we forget, most societies in the past have not been tolerant towards philosophers, but have been more hospitable to shamans and faith-healers, whose claims seemed much more plausible. Shamans are rarer in modern society than they once were. One sees an occasional sign in the window of a private home advertising their services. We may also hear about them in Anthropology 101, a subject fewer students study now than in my day — relatively speaking — but the term may also occur en passant in Sociology 101, a subject which attracts an increasing number of students in colleges and universities.

But shamans, as I will argue, are the original or ur-physicians, and emerged as a force within tribal societies before medicine was hijacked as a “profession” and became increasingly devoted to “physical medicine” and less and less to the spiritual well-being of individuals.

The shaman in early society was a standard figure in larger human groups, and his/her role was to achieve relief from pain and discomfort for others through specialized knowledge of “nature” (plants, animals and what we would call “natural resources”) and also through presumed access to the world of spirits: benevolent spirits, indifferent spirits, or evil ones, that is, demonic and covetous spirits.

Like humans, spirits responded to others and could be helpful or mischievous, even downright evil. They had to be contacted and approached before one could negotiate with, even master, them. This was the unique job of the shaman, a job perhaps inherited from the father or mother, and it required a life-style different from those of their tribal confreres. Indeed, in many early societies the roles of supreme ruler (king) and shaman were combined — a potentially hazardous combination, because when kings fail they face revolt and execution. Kings accepted their responsibility for the future of their “tribe”; shamans did not.

Generally speaking, shamans also did not guarantee success in restoring health and comfort to their “patients”. They could not be sued for malpractice! Failure of their mission could be attributed to extenuating circumstances, never to the shaman as an inadequate practitioner of their esoteric art. One price paid for this exemption from responsibility was that they were often viewed as outsiders, and some modern commentators have suggested that shamans had symptoms of schizophrenia, which could easily isolate them from the rest of their tribal fellows.

I do not want to give the impression that there is a smooth continuity between the shamans of old, as they operated within early societies — in some cases many thousands of years ago — and contemporary medical practitioners in countries like the USA, Canada, France, Germany or Scandinavia. The latter have their roots in a series of critical developments during the 19th century: in the work of Semmelweis, Pasteur, Charcot and Freud, Lister, Ehrlich — and in retrospect above all in Darwin and modern genetics — the work of those who focussed on the role of the biological mechanisms which underlie life itself. Far from it. But this applies equally to other disciplines, especially to physics, which underwent several “scientific revolutions” (in Kuhn’s (1962) sense) during the 20th century and which is almost as far from the speculations of Democritus (c. 400 BCE) and Aristotle (c. 330 BCE) as the shamans (c. 10,000–5000 BCE) are from modern physical medicine and psychiatry. Continuity is not linearity and does not imply a smooth uninterrupted development from early roots to the present; it refers to similarity of problems, not of solutions.

Thus, people have been struck speechless ever since we first adopted speech as our primary method of communication, but our understanding of what produces periods of speechlessness in our lives has radically changed as a result of systematic research done on this problem over the last 100 years.

Shamans assumed that our states of awareness (consciousness) reflected something about the natural order of the world they lived in: that spirits were “of the essence” of things and that material bodies were “abstractions” in the sense that a “mountain” — or a “tree” — was a manifestation or representation of a reality greater than, or above, what was physically experienced. It is a view which is still with us — and which underlies much of what goes under the name of “philosophical idealism”.

What shamans in general assumed was that ubiquitous spirits, which were part of everything perceived, could be influenced, even controlled, by “special methods” which were unique to them. These methods included secret concoctions and brews as well as idiosyncratic methods of solicitation (like incantations and gestures, including dances). I am reminded of scientists in the mid-20th century who claimed that the true path to Truth is via method — itself a half-truth.

The shaman furthermore believed — as did their followers — that practices of divination not only propitiated spirits but were instrumental in healing others through the mediation of other spirits, alien to most people but powerful. Healing, for the purpose of our discussion, is viewed as a “restoration” of spirit, for even boils and bodily wounds were often viewed as manifestations of the spiritual. In general, it means that without a universal belief in different levels of reality — and a firm conviction that the world of spirits is primary — neither shamanism nor religion would take root: chimpanzees, our close biological kin, are not known to divide their experiential world into a spiritual and a physical realm. It is a human proclivity to assume this.

We do know this: our earliest text about what humans believed about their world, Hesiod’s Theogony (c. 650 BC), composed some 2,700 years ago, reveals that our ancestors viewed their world as inhabited by gods who were not different in most respects from themselves, except that these were more powerful, more corrupt, more vengeful and conniving than we allow ourselves to be.

Our ancestors faced two sets of issues: unwellness due to spirits and un-wellness due to physical injury. The former could be “cured” by spiritual means, the latter could be alleviated. A woman bitten by a poisonous snake may be helped by ointments made from special (secret) plants, but a man in violent and uncontrollable temper or mood — overcome, as we say, by emotions — could more likely be helped by spiritual methods (although smoking pot or certain libations and brews, may also alleviate his distress). An arm lost to a lion could not be restored, but the suffering which follows could be abated by appropriate spiritual intervention.

Spiritual intervention means healing an afflicted person by restoring their sense of wholeness and well-being. Being struck by lameness, loss of hearing, inability to balance when walking, or loss of speech are all examples of such afflictions, and they are prototypical of ailments from which a person can recover, and has been known to recover, fully or in part. Each of these functions can be restored through the help of a third party: the faith-healer, the miracle worker, perhaps with the help of a friendly exorcist, or with the support of a social worker or spiritual advisor-priest.

All these “moderns” — whether dressed in white lab coats or in ceremonial and religious garments — fulfill the role and functions of earlier shamans. They often mediate the “cures” promised, whether for long periods or only temporarily — and they always have explanations for why in some (many?) cases the treatment was “only temporary” or on occasion even unsuccessful. Not all the lame at a revivalist meeting raise themselves and walk as their preacher exhorts them to when crying “Heal in the name of xxxx”! Some do — and they persuade the rest of the brethren of the efficacy of the divinely inspired message of the preacher.

All treatments discussed above require an intermediary — a male or female “shaman”. All require that the afflicted person or persons believe in the efficacy of the treatment. Both parties to the cure share a theory to explain to themselves and to others why the treatment should work — and why it often does not. I shall address this problem in a follow-up blog, but remind the reader to keep in mind that a prosthesis is not a shamanic device.

Tail-note: Philosophers — but also modern psychologists — have a special interest in the “nature of belief”, but the former try to remove themselves as far as possible from acts which can be construed as therapeutic in intent. Wittgenstein was not a therapist. Do they succeed? Only partially — which is one of the reasons for writing this and future blogs. Philosophers often behave as if they have a therapeutic objective, as if by rewriting one’s script one can also re-structure one’s life. Perhaps there is some truth in this.

Facts and the Web of Conceit

Part 1 — Corroboration of Beliefs

When people talk about themselves — about what they saw, how they felt about everyday things, even how they discuss unusual happenings — their utterances could often have been expressed more succinctly, more accurately, with more of a twinkle in the eye or fervour in the voice. I myself invariably look for “better” ways of describing things, especially personal impressions and experiences. I am acutely aware of the difference between how I think about past experiences and how I report these to others. There is a conflict between choosing the words which give the most accurate description and tailoring the suit to the wearer.

Sometimes it is a matter of making things interesting rather than boring. One may take a measure of the other person, of their sensibilities and their sophistication. But one also needs to draw a line, to reach conclusions readily and correctly, to avoid dilly-dallying, to tell it as it comes even when this is not how things originally appeared, like a good preacher rather than a yawning lecturer. Such doubts mostly concern personal experiences and do not apply to reporting “objective” matters.

Thus when I tell you that I flew from Toronto to London on the 8th of September, 2011, this is a report that can be independently supported, that is, corroborated by independent “others”. Corroboration by others supports any beliefs I express on my own behalf.*

Belief, we can conclude, is either something we confirm to be “true” because we trust our personal experience (even when we shouldn’t!) or because we trust others regardless of whether we trust ourselves in a matter. (Was I awake or dozing when I thought I saw a mouse climb up a clock?)

Belief is therefore not a property or quality of an event, but refers to a personal, subjective state to which we have assigned transcendental qualities, namely a quality which cannot be independently verified. We do so on the assumption that it is an act of affirmation which others cannot deny us — even dare not deny us!

There is a problem when the term is used in the plural: (a) “my beliefs” suggests that I have several beliefs which together form an entity, often referred to as a set of beliefs; (b) “our beliefs” refers to a set of individual beliefs, each of which is also part of each person’s set of beliefs.

Take each in turn: (a) “my beliefs” suggests that I am the owner of several belief-items, that each item is, or may be, independent of the others, and that these have no common origin. It is like having individual pearls in my jewellery box, not strung together into a necklace or fashioned into an ornament. Furthermore, some pearls may have a different hue than others, or differ in size, features not mentioned or discussed by my statement.

But beliefs, like pearls, differ in brightness, size and other properties, and some beliefs are more intensely believed than others. Also, the different beliefs held may be coherent, that is, constitute a web of beliefs and therefore have a demonstrable structure which relates them to each other — or they may, like pearls, lie separately, waiting to be arranged into one or more pieces of jewellery.

It seems that more often the latter describes things best, even though we fantasize otherwise on the assumption that beliefs should form a “configuration”. We may occasionally be able to demonstrate that some belief-items cohere, though the wish here may be father to the claim.

Consider the term (b) “our beliefs” (which I treat as one word referring to one item). This may refer to one of two notions: first, that each person has a set of (independent) foundational ideas, which is referred to as “our beliefs” since it refers to an individual person’s set of individual beliefs. Yet my set of beliefs and yours may differ, so that it is appropriate to refer to these in the plural form, as our beliefs, your set and mine.

Note that belief-items may be shared, but that each set is different: an individual belief-item is not necessarily shared by (common to) all members of the group of people. It may result in the group being separated into sub-groups, as are those members of a church who hold that only men should be priests, or who hold that only unmarried men should be admitted to the priesthood — currently a debating point in several religious communities.

Most often we agree to sort beliefs into those we hold in common with others to form a belief-cluster (or dogma?) which unites us into a group and which therefore leaves beliefs lying outside this shared area as “idiosyncrasies”. This often represents a first step towards open-mindedness. United we stand, cry some! Let anthropologists and sociologists characterize us as best they can.

If we ask, “Can a list of ideas everyone shares with others be given a basket-label?” we come up with two answers, not one! There often is a class of foundational ideas but there is often also a basket of transcendental ideas. Foundational ideas are those which refer to experiences which cannot be destroyed by doubt, ideas which are skeptic-proof.

Descartes (c. 1630) comes to mind as the best known advocate of this position, but also Locke (c. 1690) who claimed that all ideas were experience-based and mediated initially through sense-perception: the slate for Locke was clean but also ready to receive messages, unmistakeable imprints.†

Accordingly, foundational ideas refer to those first impressed (imprinted?) on us and which therefore cannot be eradicated or revised, but only built upon. Some writers have argued that first-impressed does not mean that what subsequently follows is inevitable, since there is no rule which states that such material has subsequently to be processed in a particular manner. Such a claim would be like claiming that given flour, water and a few basic ingredients one can only create one type of bread and no cakes. Tell that to a lover of apple strudel. One may say that the structure of the web is not determined by what the web is composed of — as any graphic artist knows. The relationships between the materials used to make a painting are only one of many possible relations.

It is widely held that transcendental ideas are those we use to order our experiences. According to Kant (c. 1800) ordering experience requires concepts of “time” and “space”, yet these are entities which are independent of the experience that is being ordered, as are the logical categories which are part of the ordering process. The existence of such ordering ideas has always been presumed as a given (Plato wrote extensively about these).

In principle it would be possible to give each experience its own unique name, though nobody knows how this could work in practice. I see a sheep now and I could assign it a sheep/today’s-date/present-clock-time tag! It is a problem in coding, in handling the codes, and in book-keeping. The next sheep I encounter would carry the name sheep/current-date/present-clock-time. No person I know would be comfortable in such a world, but would invent short cuts — but machines could function quite well in such a world. In other words, it is possible to conceive and to build devices which operate according to such sets of rules.
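The book-keeping such a device would need can be sketched in a few lines. This is only an illustrative toy, and the name `unique_tag` and the tag format are my own inventions, not anything proposed in the text: each encounter receives a kind/date/clock-time name, with a running serial number to keep two encounters in the same clock tick distinct.

```python
from datetime import datetime
from itertools import count

_serial = count(1)  # disambiguates encounters that share a clock tick

def unique_tag(kind: str) -> str:
    """Return a never-repeating name of the form kind/date/clock-time/serial."""
    now = datetime.now()
    return f"{kind}/{now.date().isoformat()}/{now.time().isoformat()}/{next(_serial)}"

# Two sightings of "the same" sheep receive two distinct names:
first = unique_tag("sheep")
second = unique_tag("sheep")
```

A person would immediately collapse such tags back into the short cut “sheep”; the machine, by contrast, can store and compare them without discomfort.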

We don’t know — before plans are actually laid out — how effective and functional such a device would be, or what it would have to look like. At first glance it may work quite well: known as a robot it would operate according to emerging principles of robotics. I don’t know whether robots have to “feel comfortable”, but I suspect not. Robots do not have beliefs in our earlier sense. Vive la difference. Without beliefs they may not go to war!

Robots could nevertheless change their actions and adapt to their environment — including their self-made environments — but the operating principles need to be worked out in advance by us, their originators or creators.

*Footnote: I have written a separate blog on corroboration which will be published soon.

†Footnote 1: There is a modern school of thought which reversed this position and holds that all ideas are influenced by the existence of other ideas concurrently in circulation. Contextualism has its roots in Charles Peirce’s philosophy of pragmatism, and its most notable advocate in the mid-20th century was W. V. Quine (c. 1950-1990).
Footnote 2: Contextualism was inherently unfriendly to logical empiricism, as publicized earlier by A. J. Ayer (1936) in his widely influential book Language, Truth and Logic, a book which echoed many of the positions first espoused by the early L. Wittgenstein and subsequently by R. Carnap and members of the Vienna Circle, and after the Anschluss in 1938 by readers of the journal Erkenntnis and its USA successors.

Users of Language: PITS and PAW

In this blog on the use of language I introduce two terms which may be useful in discussing a problem which was created for us in the last century and which continues to haunt me: namely, what is the relation between our language and our increasing mastery of the world we inhabit. By the latter I mean what we refer to as our scientific knowledge and our ability to adapt much of this knowledge to everyday use, i.e., technology.

The plumber joins metal pipes — it is his skill — but he did not invent the materials he uses or the methods for achieving his objective which is to weld two pipes without these springing a leak when fluids under high pressure pass through the pipes. We are all members of a community as well as members of different specialist groups who tend to speak their own patois, or at least use their own specialist terms and phrases.

Word cloud from an article on neuroplasticity published in the New York Times

Of course, as members of the community we share common problems and goals and we learn to speak about these with others without difficulties but without using specialist terms or expressions. Misunderstandings can be clarified, although agreements cannot be guaranteed. This makes us “persons in the street.” The phrase is old and well used, although it also appears as “the man in the street.” I’m not sure of its origins, but I heard it frequently during my graduate-student days when I attended meetings of philosophical societies and also participated in many lunchtime conversations over a beer in pubs in London. “The man in the street” referred to all of us in that we were “ordinary” — people stripped of our professional or worker’s outfits, “citizens” and “family members,” but not as artists, academics, public servants, office workers or street-pedlars, the kind of people who would show up at lunch-time in and around Charlotte Street and Soho, London.

The pubs on Hampstead Heath, on the other hand, were much more “exclusive” and over weekends became the stamping grounds of an intellectual crowd, not of “persons in the street.” The latter I shall baptize PITS (Persons In The Street). “Persons” is here used generically for all citizens, regardless of their sex, race, affiliation during work hours, or their political or religious persuasion. It is an all-inclusive term.

Word cloud from a scientific article reporting research on an aspect of neuroplasticity

The contrast to PITS is ourselves during work hours, when we don our work-caps or professional hats. During this part of our day we tend to converse with others using specialized terms and phrases, sometimes in sentences which defy ordinary grammatical rules.

I suggest the acronym PAW, for “Professionals At Work.” “Professional” here refers to the notion that regardless of what one does during work hours, one adopts words and expressions which may be quite foreign to most PITS.

The majority of persons-in-the-street also learn to use an in-house language suited to their specific work environment. Thus nurses speak as comfortably in the vocabulary of hospitals as does the medical faculty. Admittedly, the latter — the medics — may master additional terms, which in turn depends on which speciality they practise.

The language of PITS is fluid, as dictionaries of everyday language, like Webster’s, demonstrate. Technical terms enter — and depart — at a staggering rate, which is something new to our social experience. It reflects the pace and rhythm of our technology-dominated culture, which forces everyone to march to new tunes throughout our ever-longer lifetime. The strain on each individual can be terrific: cardiologists have their work cut out for them.

But it is not only the arrival and departure of new and old terms which troubles me and forces me to read wiki articles more often than anything else daily, but that terms with which we are already familiar change their meaning often throughout our lifetime. This applies to ordinary “in-the-street” language which includes the “received wisdom” from the past (much of it very dated and therefore quite false), but also the wisdom of more recent origin. These are issues I shall address in future blog writings. Please — stay tuned!

Faust’s Wager

Faust’s wager with the Devil was straightforward: Faust declares that “If to the fleeting moment he could say, ‘stay yet awhile, thou art so fair,’ then he would yield his soul to Mephistopheles forever since traces of his earthly being would linger for all eternity.”

Faust, unlike his author Goethe, was a traditional Christian believer: he believed in God, communicated directly with him, believed in the Holy Scriptures and the rituals of the Holy Church; he also believed in the existence and reality of the Devil, Mephistopheles, and in his supernatural powers — enough at least to strike a bargain with him. Faust had indeed studied all the sciences, immersed himself in the wisdom of the past and the venerated ancient writers of Greek antiquity. Despite all that, his learning could not answer many “foundationalist” questions: why life, why colour, why innocence and sin, why beauty and horror, what creates ecstasy and bliss — and why doubt nags one’s inner composure, one’s certainty about one’s beliefs.

The terms of the wager between Faust and Mephisto sound straightforward and can be interpreted in several ways. The bourgeois century, which witnessed the wholesale transfer of power from the landed aristocracies of Europe to a new class of merchant princes and their minions, interpreted Goethe’s Faust as a chronicle of unrepentant (male) ambition and greed mitigated occasionally by feminine compassion. But the two prefaces to the play — rarely performed — tell another story. The second of these is wonderfully cheerful in its depiction of God and the Devil. Both have a great sense of humour and in their debate about whether God created a wholesome, morally defensible universe, strike their own wager about God’s ultimate creation, Man. Is he morally sound, or do his ambitions, his unbridled lusts for power and sensuality destroy the world he was given as his playroom? Man need not “believe” in God, worship Him as a creator and source of wisdom, but his test is whether he will pursue his quest for knowledge and blend this with his sense of compassion. This, as we learn, was God’s plan.

A good man, says der alte Herr — the old gentleman — to his former servant, the Devil, is aware at critical moments of his ultimate obligations towards others and to future generations and their well-being. So there are good men — but also evil men; the former are non-seduceable, but others certainly are. Each must be judged by their deeds and intentions — and must be judged by others than the devil, who has a glaring self-interest in the outcome of such judgement: the devil is guilty of the same sins which defeat so many men in their strivings: he has an insatiable lust for power and glory, not an urge to achieve worthy goals. He is the paragon of evil.

The debate about what is “worthy”, about what can or even should be approved by us, is as old as humanity, or as old as there have been creatures whose actions towards one another are not only guided by what used to be called “blind instinct” but by a calculus of desirability. God made instincts, but humans were also given judgemental powers so that they could decide between actions and make judgements based on the merits of each case. If modern science draws a portrait of humans as animals which have a choice about which actions to pursue, especially actions which influence the well-being of others, it supports a humanistic philosophy, not a theology. The issue is entirely about the scaling of ends and the evaluation of means seen in the light of desirable ends.

In the story of Faust (as told by Goethe), the Devil loses his wager with God: Faust, now frail, blind and old, yields momentarily to the bliss of the moment as he visualizes his ultimate achievement, the sounds of workers erecting under his direction a Utopian city he had planned for humans. It is not the city of God — but a city which will be governed by men and women exercising good will and compassion, rather than acting through the forces of greed and self-interest. (Goethe wrote before the rise of the press-barons of the 19th and 20th centuries, or the industrial, financial and multi-media barons of the 21st century!)

Salvador Dali: Vieux Faust

During this banter between God and Mephisto the two make a wager whether man — even wise men like Faust — can be seduced by the promise of glory and boundless pleasure. “Easy!” says Mephisto; “Not so,” claims God. Man, God proclaims, is aware of the difference between self-centered pleasure, self-interest, his own sensuality, and selfless dedication to broader goals, even goals which he is unable to articulate at the time of his actions — and which may cost him his life. At the end it is God who wins the wager, not by deviousness — the method favoured by the devil — but because Faust when left to his own devices orders his priorities in a manner which favours the rights and wellbeing of future generations. The city is built for future generations and its purpose is to enable a better life to be lived than experienced by its architect and builder, Faust. It is a vision which is close to godly — and God approves, as does Goethe.

One can speak loudly in the name of the Devil, but only cautiously in the name of one’s god. Goethe appears to proclaim that the godless know evil as well as those who believe in a transcendental being, but that to know evil is not a license to practice it. So Rabbi Hillel (c.30 BC) was right that the only rule required is to act in a manner compatible with how one would expect others to act towards oneself. It is certainly not a rule which applies to the world as a whole, but is limited to the interaction between people. Its status as a guide to conduct stands on its own merits.

On Borrowing and Inventing Words

Many words in constant use are borrowed, stolen, adapted and invented. Of course, one cannot go to court to reclaim a word, unless it is proprietary (pharmaceutical companies do so at a drop of a hat). Has anyone been sued for using the phrase, “I googled the information” or “I hoovered my patio”?

Whatever its historical precedent, borrowing words and expressions from other languages than our own is certainly trendy. It is stimulated largely, but not exclusively, by intense global trade and the phenomenal rise of certain technologically-based enterprises which depend heavily on inventions and discoveries. These have a vested interest in getting their proprietary names circulated as widely as possible. Principle: what is good for consumption is good (profitable) for business.

There seems to be more than commercial advantage behind the extensive proliferation of words. Here I examine only three reasons. Each has been named to give a clue to its use: 1) Sanitary Measures; 2) Tagging the New; 3) Insider-Outsider Separations.

1) Sanitary Measures

If one wants to rid one’s home-language of ambiguities, importing words often helps. Ambiguity is viewed as a curse for which a cure is needed. Language has to be sanitized to keep it clean of confusion. It can be done by replacements from an unfamiliar donor-language. On this criterion, Zulu words may be better than imports from Spanish – especially at a time when the latter is becoming the second language in the USA. No one has followed this suggestion – but it could happen. The borrower-language gains a word which can be given a singular, unambiguous “unique” definition in English.

Note that borrowed words need definitions to accompany them on the journey, and these definitions are translations: the words and expressions used in the definition must themselves be unambiguous. This is difficult to do – and sometimes fails. However, the possibility that a word taken from the donor-language may be ambiguous in its home territory is irrelevant: importing it will strip the word of home-ground ambiguity.

This desirable effect may be temporary and may erode quite quickly in a new environment. Like kids, words don’t stand still for long. The new term will be defined when first introduced, an act designed to freeze the meaning of the word, to isolate and insulate it. Like a band-aid the protection wears off with prolonged exposure. Most users have no vested interest in protecting the word’s “purity” or singularity of meaning, and will allow it to deteriorate. At some point the process of purification may restart – a new word may then be imported: the band-aid is replaced.

No one seems to mind these perturbations, except inveterate conservatives who with advancing age may resist innovations in communication with unseemly vigour. The effort to establish clarity and avoid ambiguity may therefore be quite short-lived before confusion re-asserts itself. On this analysis, language forever changes.

Would a universal language cure this, as some have suggested?

Unlikely, because the need and demand for new words to tag new phenomena will continue unabated. Our efforts to expand knowledge are based not on such noble principles as Francis Bacon (1561-1626) occasionally proclaimed in his prefaces but on an objective he recognized well: to increase the wealth of the “commonwealth” (read: nation-state) through the proper use of the new knowledge, the fruit of empirical research (based on experimental induction!). New words in an old language would surely draw attention to what is new, but it is new within the context of the old.

A whole industry has sprung up recently to ride the wave of promoting “innovation” in goods and services and the terms which landmark these. From this perspective a new language would be counter-productive; it would have only temporary advantage. To call a new mouse trap by its traditional name is unlikely to enhance sales and encourage wide adoption. (Nor would its manufacturer be interested in exterminating the species!) New mouse traps need mice in profusion, a new technical principle or architecture, and a new name to ensure their commercial success.

Should one hesitate to advocate the emergence of a universal new language like a newly invented, constructed language on the model of Esperanto? Or a simplified English, as Winston Churchill had proposed to unify the post WWII world? Neither has been successful. Language is a cultural phenomenon: create a new culture – if you know how – and you will get a new language.

2) The Phenomenon of Tagging the New

One may wish to bring awareness of genuinely new experiences to others, perhaps a new style in dress (remember the miniskirt?), an unusual type of music (like rap), a recently discovered aquatic plant, or a newly-discovered insect. It may suffice to report, “I saw a new bug today, which looked like a giant ant,” but sooner or later someone will demand a new name, a tag which will forever differentiate this new creature from all others, particularly from its close relatives!

The temporary tag may suffice momentarily: without great ceremony, *giant ant* will be discarded. The new name may be drawn from a foreign language – probably Latin (entomology is replete with names of creatures composed from the skeleton of Latin) – and the new label will be supported by many descriptions and endless photographs and fine pencil drawings of its gross features as well as its fine structures in the fashion of Dürer. But this habit of Latinizing entomological entities may change if more and more entomologists come from China: a Mao-beetle perhaps? It is not the name that matters, but the detailed descriptions and its presumed relationship to other species.

Scientific terms, furthermore, are less prone to ambiguity because the number of people using such terms is relatively small. Also there exists a culture of respect for definitions which is significantly different from what happens in everyday language, from the language of the street, of the home and the workplace.

3) The Inside-Outside Phenomenon

The third reason for importing terms from a foreign language may appear trivial, yet it is nonetheless historically important. People who share an interest in some activity (e.g., golf) or set of objects (e.g., butterflies) may consolidate and develop their own lingo which sets them off from others. They become serendipitously an insider group – and create an outsider group, those billions on the other side of their fence. There is no better example than the military or comparable organizations, like the Boy Scouts.

Furthermore, when one section of a community wants to establish a special position for itself it often does so by also – and inadvertently – developing an “insider language.” (As a former Brit I have never understood the jargon of football.) Many secret societies do so (Freemasons, for example). I find myself often using the phrase *en passant* when the English expression *in passing* would do just as well: my generation of academics was prone to this form of elitism. It was not a matter of whether one “imported” words, but of whence the import came! In my case, the barriers established through parental training broke down during teen-hood and the language of the age-group overrode many social class barriers, a process which could work in both directions – and of course often did. Rule: social groups develop their own speech and jealously guard its borders.

Of the three factors reviewed, the second seems to me the most interesting. It suggests that terms are imported into a daily language whenever new phenomena are identified and become locally important. In my youth the car was king; today the focus is on electronic devices, including communication devices and robotics. Both have produced a glut of new words unfathomable to my European grandparents (1870-1940). Of course, new terms do not have to be borrowed or pilfered from another language, as suggested earlier, but could be wrought from different roots, both foreign and from within the home-language, as *apps* was, or even from two roots within the home-language. The term *vacuum-cleaner* illustrates this: it consists of two already familiar words which were later fused into one.

*Vacuum-cleaner*: a borderline case

One could object to my example of *vacuum-cleaner* as a combination of two colloquial words and argue that it is a hybrid of a foreign and a colloquial word. The term *vacuum*, one could argue, has distinct foreign origins. It was certainly in use at the turn of the 18th century but in a very restricted way. During the mid 18th century it became more widely known amongst the “educated” males in Britain, who recognized that *vacuum* referred to space unoccupied by matter (and whose root was the Latin vacuus = empty). To understand the term *vacuum-cleaner* only required that the user realized that a vacuum creates suction and that it is the suction induced by a motor (?) which accounts for lifting dirt from a floor! I suppose the term *suction-cleaner* would have done as well as *vacuum-cleaner*, but we cannot reverse preferences of fashion or the order in which these appeared in history! *Vacuum* was certainly more “elegant” than *suction* – and this may have played a role in the choice of words. What is of interest is that a term was selected which had minimal ambiguity and which could be given a “clear” and “clean” definition.

No Utopias, Please

We easily slip into what I characterize as our “Utopian mood”, a time when we dream about what an ideal society would look like and what it could do for us. Utopia was invented by Sir Thomas More (1516) to depict a perfect society, one without stress, injustice or overwhelming poverty — a nowhere, or a somewhere over the rainbow!

More’s idea has often been revisited since its invention in the early 16th century. It attracts attention whenever our social and political conditions deteriorate beyond what we regard as tolerable. The idea of a society which would be perfectly suited to our needs was first raised by Plato in The Republic (circa 360 BCE), but one could argue that the story was preceded by the Biblical account of Paradise, which we furthermore forfeited through our cupidity. Paradise had the additional advantage that it was inhabited by only two humans and was therefore shielded from the evil of others, from rivals, competitors and polluters. As the story of Cain and Abel shows, humans as a group don’t develop into a peaceful ensemble.

The major difference between The Republic and Paradise was that Plato advocated a cure for human ailments, for a tomorrow and for an alternative to present-day woes, whereas the Garden of Eden is the story of a paradise lost through man’s greatest evil: supposedly his insatiable curiosity, which was offensive to the Creator. So says the Bible.

What individuals like myself regard as among our finest attributes (along with others from Plato to B.F. Skinner in Walden II, 1948) — namely our curiosity and our perseverance in pursuing it — the Hebrews regarded as venal and a punishable sin. It has been so regarded by every authoritarian government since, and for good reasons. The biblical account of the Fall of Man (what Fall?) is a story of the denial of an existing Utopia and not of concrete plans to better the world. From Eden to Hell is not a political agenda, but a tale about our condemnation for being curious and inquiring. We are condemned for qualities which — in most educated human eyes — are entirely positive, specifically those of being innovative, curious, and creative. These are the main qualities that can ameliorate our burdensome and problematic conditions, and they stand in sharp contrast to sitting resigned and mourning a long lost past which it seems we never had and which, all things considered, was hardly admirable.

But what is wrong with our Utopias, the many we have construed throughout our history, about which poets have written, minstrels have sung, and about which composers like Beethoven in the finale of his 9th symphony anguished?

All have in common the idea that there is a perfect state of individual and social existence. Many have glorified the idea that human happiness and contentment depend upon social justice, the rule of law and freedom from hardship and want. We rarely crave affluence but often express a right to live without compulsion from others. Each epoch has come forward with its own proposals on how a perfect society can be assured, although the lists of how to accomplish such goals do not necessarily agree. However, these have in common that they deplore the things which are wrong! In short, we do not agree about what is right — but we appear to agree about what may be wrong.

The task ahead is not simple, because we do not know how to root out those things we view as undesirable — except by prohibitions and controls! Is it because we have not yet mastered the secret of how to co-operate to achieve common goals, or how to set these or how to formulate the principles which will assist us so that we can discover common goals?

If we know what is woefully wrong with things as they are, can we discover the means through which an accord can be reached on how to find appropriate remedies for our ills? I suggest that we make the task less general and sweeping, and start our search with plausible — not ideal — remedies. Everlasting cures have a habit of turning into festering sores and do not yield good prescriptions.

There are remedies which ease pain and those that defeat the disease that ails us. Clearly our need is for the latter. And it may not be an exercise in good leadership to offer an inflexible menu for the unforeseeable future. Remedies are for present ills, not for an everlasting life. Getting things right for the current condition makes for a good start but we also need to be mindful that there is always a future beyond the present which we cannot foresee and therefore cannot take care of. The future is largely unexpected because we are so curious and creative. Prescriptions for the long-term future take away the option of choice for others yet unborn who need to have a voice in their own affairs. Utopias are not for me.

My main objection to Utopias, then, is that these represent more than a critique of the present — and of our past — but advocate a future which does not consider the unforeseen consequences of the changes advocated by the proponents of each Utopia. Utopias may be totalitarian and too prescriptive about how desirable ends are to be achieved. If this is not part of their initial intent it may nevertheless be the consequence. Karl Popper certainly thought so about Plato’s prescriptions, and thought the remedy Plato proposed worse than the illness which had overcome Athenian society (Popper, The Open Society and Its Enemies, 1945, Routledge).

Plato’s plans for a republic promised the permanent enslavement of all who failed their childhood intelligence tests! A similar critique could be made of Marxian Utopias, of societies based on the principle that the control of economic resources should be placed in the hands of an all-powerful “people’s congress.” We know what that looks like and what it leads to — a dictatorship of the ambitious (see also George Orwell’s Animal Farm, 1945, and 1984, 1949, for treatments of this theme).

By all means, let us have more Utopias, but we should also make sure that we have many Hyde Park Speakers’ Corners, where different proposals can be aired without being enforced. Some proposals surely need as much fresh air as we can spare.

Two or More Cultures?

My earlier entry on Clarification and Definition is one of many which reflect my long standing interest in philosophy, particularly how my own major discipline, experimental psychology, has been influenced by ideas of Western philosophy.

Now that I am retired and have no laboratory to retreat to and no white-coated laboratory associates to hang out with, I spend much of my time writing about issues which have always interested me, yet which are often broader than those dealt with in a research setting. These interests stretch over a wide range: art, theatre, music, cultural history, as well as the natural, human and behavioral sciences. I have never been a “one culture” person, as outlined in C. P. Snow’s celebrated Rede Lecture of 1959 on “The Two Cultures,” but like so many others of my generation I combined a strong, dominant interest in my profession with interests in aspects of the general culture of which I am a part. I see no conflict between being intensely interested in modern technology and its sister, contemporary science, and retaining a healthy passion for traditional cultural activities and their wondrous artifacts.

Image: Matt Collins, see below for credit

A strong interest in both cultures therefore seems to me to be perfectly compatible with living in the 21st century. One can keep in step with both worlds and accommodate, to the extent that is possible, the rapid changes in the world of science and the increasing pace in all aspects of our culture. I often feel like a child in a toy-shop waiting for the toy-maker to bring out more from his presumably messy workshop. The old is being eroded and the face of the new is barely distinguishable through the dust of our old demolished Europeanized world. We face not one or two cultures in the future, but a multiverse. It may be something to look forward to for those brave enough to face the choices.*

C. P. Snow, whose novel The Masters was a brilliantly vivid portrait of the life of Oxonian and Cantabrigian academics and their students before “the Fall,” shows how significantly we have changed since the collapse of Europe. Our universities are in disarray, and our vertical culture, too. “Downton Abbey” is down, shabby, and condemned to extinction, as are all who lived in it. A terrible culture when looked at with the naked eye, a monster when viewed through critical eyes. Is this the culture which I see before me, whose virtues are praised in so much of the literature of the last century? Are we not misguided to hanker after a culture whose greatest achievements over three hundred years were nationalism, colonialism, and endless wars? Undoubtedly Science, Literature and the Arts emerged in splendour out of this troubled sea, like a Botticellian Venus, but they did so at a heavy price.

We need a better understanding of ourselves and our world to get to the other side of this great divide between our past and our future. Can we do so by learning from past errors? I evidently think so. It involves clarification, analysis and criticism, and this in turn requires us to hanker after brave new worlds, not dilapidated chintz. One cannot simply predict the future: children’s comics try, but combine fascinating possibilities with monstrous visions of barely imaginable mayhem. The comics for adults only increase the mayhem, but they also reveal the vivid blend of the imaginable with the real. We are remarkably good at creating monsters, at depicting the faces of evil, but we also have an aptitude for implying what is wholesome, what should be selected from all the visions we have created of the future. We define future possibilities, as Hieronymus Bosch, Jules Verne, or Mary Shelley did, but we also clarify which of these options are desirable and achievable. That is the job we will always have to do; it is the price of being creative and inventive.

* The lecture was published in book form as The Two Cultures and the Scientific Revolution (1959) and reissued in expanded form in 1963 as The Two Cultures: A Second Look (An Expanded Version of The Two Cultures and the Scientific Revolution).

** The image of C. P. Snow atop a bridge between the cultures is from a 2009 Scientific American article, “An Update on C. P. Snow’s ‘Two Cultures’,” by Lawrence M. Krauss.