As most of you know, our beloved mentor Dr. Harry M. B. Hurwitz passed away in August 2018. We have come across the following note, dated June 2011, and would like to share it with you.

I recently wrote a set of essays, each of which I dedicated to a particular friend. He is one of many no longer here, but like others he lives in full colour with me, in my memory. Those friends who are still alive may not recall me with equal vigour or affection as I do them, but this does not matter since affection does not demand reciprocation. Furthermore memories are fickle – one person’s belies that of another. In many cases I have outlived my friends by happenstance, with the aid of good genes and excellent doctors. But in other cases death comes unannounced and unexpectedly, leaving families and intimate friends distraught and diminished.

I miss my friends and would have wished them everlasting life. But wishes are not a factor controlling the rhythm of our existence. One learns rather painfully to be content with memories and to celebrate these often and on appropriate occasions.

Dr. Harry M. B. Hurwitz — June, 2011

We miss you, Harry.

Clarification & Definition

The question “what does a philosopher do?” — or the complementary question “what does a dentist do?” — is not answered by giving a narrow, restrictive definition of the two critical terms, but by offering a clarification of the meaning of a particular unfamiliar term or phrase.

To do so effectively may require that each term is placed in as many contexts as possible. This would demonstrate to a foreigner unfamiliar with the term its breadth of use. The foreigner may then search within their home language for a comparable term which is similar in sound and thereupon take the plunge, aware that he/she may indeed have guessed incorrectly!

Both terms cited are names of occupations — what a person does in their occupational life during working hours. An answer would therefore consist of a sentence or two which identifies what each group of people carrying the label do as wage-earners.

There is considerable room for errors when answering each question. Yet, within limits, any answer given would be open to correction or modification. (Note: We tend to be tolerant towards outsiders when it comes to language-use — we forgive them their trespasses!)

In what way does a clarification differ from the definition of a term? Every clarification is an attempt to explain to someone who admits that they do not yet understand the meaning of a term, in what way the unknown, unclear term is used by others who — it is assumed — are already familiar with it and its common (even several!) uses. We asked the question originally because we realised at that moment that we were not yet privy to the term as it is being used by others. In truth, we wish to participate in a conversation and realize that to do so we have to understand what others are saying. We generally also use a context to facilitate or aid our understanding of what particular events are being referred to, what is being named, or what quality of an object or event is currently the focus of interest for others, the current centre of their concerns.

Clearly offering a definition to someone who has actually asked for a clarification may help that person, but the definition may itself contain elements which are not understood by the interlocutor. He/she may come back to request further clarifications and may do so until every term used in a definition is understood or until the definitional sentences themselves are fully understood.

Furthermore, when people ask for the clarification of what is for them currently an unfamiliar word, term or expression, they expect us to stake out the characteristics of what is confusing or unfamiliar to them! Only then do we say, “I now fully understand” (and often also add a sigh of relief)!

Much of what we say in an explanation will be quite clear except for a particular (target) term or terms. A definition, then, may help us to some extent, but not on all occasions. The puzzling term may already be familiar to someone, but not the context in which it is being used on this particular occasion. Of course, people bring a vocabulary to every discussion (unless they happen to be foreigners whose language has no overlap with the language being used). If a language of words is unavailable to parties of an exchange or conversation, such people would indeed be severely handicapped and may be forced into exchanging even elementary ideas like *right* or *left*, *up* or *down* by resorting to forms of communication other than words, much as earlier European explorers did during voyages of exploration and discovery in the Americas and in the Far East during the 15th and 16th centuries. These voyagers employed gestures or even acted out their ideas, wishes and proposals!

Gestures between humans have always been helpful but do not promote discussions about ideas. Modern humans live in an environment which may be described as consisting of references to objects and items which are products of human invention and whose uniqueness is given by their appearance and functionality — that is, by our having learned how an object differs in kind from another by virtue of its context.

A prime example which comes to my mind is the ubiquitous button or switch whose functionality is associated with what operation it was programmed to control. The button on my electric dryer is the same as on my car — but its functionality is totally different and non-comparable. Much depends on the preparedness of the questioner to be taken into a field of knowledge with which they are already familiar, for which they may already have even a rudimentary vocabulary.

The contemporary world is so stocked with “knowledge” about diverse matters that most of us are truly ignorant, although many are prepared to learn and to add to both our existing knowledge and vocabulary! Ignorance can be remedied, and more people than ever are prepared to do so. We have all somehow learned that errors and the absence of knowledge are widespread and often astonishingly common, so that it is our individual responsibility to remedy this (lamentable) state of affairs whenever possible. Many of us do.

We do not usually add to our knowledge of things and events by learning (memorising) definitions. Learning definitions — whether by rote or in some other way — has its uses but it is a method useful in specialised contexts only. Most of us learn to offer a definition upon request, e.g., the definition of a soup-spoon, in contrast to a tea-spoon. The definition of the latter would not be covered by “a smaller version of a soup-spoon”, or “a spoon used to stir a tea-pot”, whereas “a spoon smaller than the normal soup-spoon and used in a variety of situations where a small spoon may be useful, like eating a cup of berries” would serve.

In short, a clarification serves to help us learn the meaning of a term, including the contexts in which the troubling word is used most frequently — but also when it is used rarely. A definition of a word is more narrowly aimed. First and foremost it serves the purpose of informing us about a word’s restricted reference, even though any word may have a wide range of uses and meanings. It is a more advanced undertaking which often requires that learning the new word also involves learning how to use it figuratively, that is, analogically.

Newton’s Atomic Theory and the Divine Will

Newton advocated the view that atoms, conceived as small indivisible particles of matter (substance), together with the void, accounted for the physical world as we, its observers, experience it. A similar suggestion had been made two thousand years earlier by the Greek natural philosophers, e.g. Democritus (c. 400 BC), and was reaffirmed repeatedly by other Western philosophers without adding any observations or experimental investigations to support this view.

It had received support from both Plato and Aristotle, although it was clear to them — and to others — that the supposition that the world was constituted in such a manner would have to remain entirely speculative. Methods to test whether such “entities” as atoms could indeed be discovered were not available. However, there were some who believed that sooner or later suitable instruments would be invented to do so, a hope which helped sustain the wide-spread cosmological view that nature in its physical manifestations could be “revealed”, that the appropriate information would ultimately become available to us.

This optimistic notion received support from discoveries during the mid-17th century, when microscopic single-cell organisms were observed and studied by Leeuwenhoek in Delft, Holland, by adapting the earlier-discovered telescope, which had been successfully used to study far-distant celestial bodies. It therefore seemed plausible to some to assume that whatever governs the motions of stellar bodies also governs those on earth, and that the quest for a unified, general theory of why and how bodies behave when suspended in any medium could be satisfied. Newton certainly cast his support for this position, as shown in his comments in his widely studied thesis on “Opticks” (1704-1706), a work which was to dominate natural philosophy for the next few hundred years.

Newton wrote:

It seems probable to me that God in the beginning formed matter in solid, massy, hard, impenetrable, moveable particles, of such sizes and figures, and with such other properties, and in such proportion to space, as most conduced to the end for which He formed them; and that these primitive particles being solids are incomparably harder than any porous bodies compounded of them, even so very hard as never to wear or break in pieces; no ordinary power being able to divide what God himself made one in the first creation. While the particles continue entire, they may compose bodies of one and the same nature and texture in all ages; but should they wear away or break in pieces, the nature of things depending on them would be changed…. And therefore that nature may be lasting, the changes of corporeal things are to be placed only in the various separations and new associations and motions of these permanent particles; compound bodies being apt to break, not in the midst of solid particles, but where those particles are laid together and only touch in a few points.

Readers will see to what extent this passage from Newton confirms his life-long commitment to a theological position which he had taken over from official dogma and doctrine. He not only refers to the pivotal position of the Hebrew God as Creator of the universe, but also to his conviction — also found in Aristotle’s metaphysical writings — that whatever is observed as a genuine item of this world is part of a teleological narrative. It is the message articulated earlier — and better — that “There is a divinity that shapes our ends, / Rough-hew them how we will”, although its truth is equally debatable (“Hamlet”, Act V).

This narrative is not only pre-emptive, but strikes in advance of any pertinent evidence which could call it into question. It is formulated in a manner which cannot be contested logically, nor can its contradiction be formulated in a way that turns it into issues which could be tested (checked) by physicalistic means, i.e. experimentally and in accordance with the strictures of experimental science, which Newton and many of his contemporaries believed were foundational to studying the “Book of Nature”.

The Form of Things: A Timeless Problem

(A note from Entities, Part 2B)

What allows us to recognize “bare and higher truths”, truths that emerge only after we have logically tested conclusions against specific principles, those we agree would also lay bare falsehoods? Every truth publicly declared also declares its opposite, namely, what is false. It creates a universe of opposites — of truths and falsehoods — which may lead to the conclusion that whatever is NOT true, must be false!

One may call this imperative falsity: something is held to be untrue/false because its truth-value has been inferred, not empirically demonstrated. Because we lack reliable, tested, or documented means which allow us to decide between what is false and what is (ultimately) true, we may be forced logically to talk about “possibilities” rather than about “truths”.

This inadvertently creates a new universe, the world of Possibilities. Possibilities, however, are a form of “speculation”, something done by humans — a capacity quite strong in some individuals although weak, even absent, in most others. An example of a speculative product is the notion, at first timidly suggested by Greeks during the Iron Age (prior to 600 BC), that the world as known to them had a past, a history during which humans interacted with Gods, who were themselves somewhat “human-like” except that they were believed to have greater powers of control over many more features of the world, e.g. the condition of the seas, the calmness or turmoil of the “elements”, and the life and death of other creatures — including their existence, even the extinction of all living things. Enter an early version of “Science Fiction”!

Thus, the atoms of Democritus — small, unobservable, and presumably indestructible entities invented c. 400 BC — were entirely speculative. Their nature was unknown but was guessed at. Democritus put out ideas, based on the speculations of earlier thinkers, including seers and poets, which others were of course free to accept on the basis of their intuitive appeal, as well as the force of any arguments advanced in their favour!

Democritus had very few followers, but his ideas resonated with other teachers and poets, and did not, as far as is known, draw ire from the religious establishment of his time. Thought-control — although rampant and widely practised throughout preliterate societies — was mainly enforced about matters which affected public policy and religious matters. However, efforts to divert discordant ideas and expunge these from the market place required some form of “thought police”.

As the later history of the Jews shows, the formation and acceptance of a thought-police required a shift in the social status of “prophets”: from individuals who were accepted as spokespeople of a God — as Moses was — or of gods, or even as supernatural forces themselves, to those who loudly proclaimed themselves superior to existing authorities on the grounds that they were directly in contact with supernatural forces.

From phenomenological, personal insights to possessing public knowledge: the impossible path

The problem which early thinkers addressed was how the complex phenomenological world — the world of daily experience, the world of our encounters which we claim to know — could be construed from visible but also invisible, inherently insensate, events, the things seen and those behind the screen! How do we develop ideas about a world which contains both solids and ephemerals, which could also influence, or “cause”, other events? “The gods at play” are of the latter kind: we guess at their existence on the evidence that things are not as anticipated.

Our anticipations are based on what has happened reliably in the past: however, we need to distinguish *signs* from *omens*. *Signs* presage a future based on earlier experience. *Omens* on the other hand, indicate that the future is likely — or is apt — to depart from the past, that the future is not like the past. So we start from the outset with a view of a corruptible world, where the future is not necessarily like the past.

We do not know what accounts for corruptions. Is there an answer? Early thinkers were bold enough to answer such questions affirmatively: they often were certain that their unique answers were correct! But such guesses were based on insupportable sources of inspiration — and therefore could not survive criticism. They would fall apart whenever visions of the future failed to materialize, which happened all too often.

Comments on foundational substances

Plato, as successor to Pythagoras, and shortly thereafter Aristotle — thought by many to be perhaps the most influential philosopher in Western history, and himself a former student of Plato — suggested that regardless of conjectures about foundational substances — the building blocks, like the atoms of Democritus — there were also forms.

Forms serve to give structure to the perceived (subjective) world, a world which necessarily includes objects. Some objects were pre-determined, whereas others were “construed”, or seen as themselves products of the elusive mind. But which?

One could argue that structure, the form objects assume, was inherent in them, or one could assume that form was an attribute assigned to events by the “mind”. Of course, the concept of “becoming assigned” was itself problematic and generated much debate over the next two thousand years. Structure — it seemed to many — was something which was imposed on raw materials, on the analogy that the statue of Athena in Athens was hewn from formless stone. Indeed this analogy is deeply embedded in the story of the Creation as told by many peoples during the Iron Age, and recorded in their enduring myths.

It seems such myths are part of the history of our own current search for explanations, the search for what is, how things are, and how things become over time: “the past, present, and the indefinite future” as this applies to any event. It appears that during an undetermined earlier moment in our past we transitioned to accepting that some events were indeed time-bound, whereas others were not. It was assumed that some matters were “basic” and “fundamental”, that these events owed their origin to super-human or pre-human agents.

Its history therefore remains beyond our reach — and an explanation for its existence remains on the front-burner of human inquiry, even as I write. We somehow expect that the answers to our root-questions can be obtained. It seems that for the moment we overlook that any questions raised are themselves culturally determined and that answers to these are therefore “cultural products”, which come and go with time and fashion. (To be sure, there are no fundamental questions — only passing ones and answers: each makes its entrances and exits.)

Comments on structure and form

A final word about structure and content: From the point of view outlined so far, the concept of Structure is not self-supporting but is part of a duo: Structure and content are viewed as facets of how we perceive our world. (Note: see also comments in other blogs on “cognize”.)

Think of *left* and *right*. But there is also *up* and *down*— and these four concepts define one version of space. The world appears fractionated to us because we employ this perceptual stratagem which permits us to focus on two, four, or more aspects of any experience without regard to raising issue of the origin, or future of the event.

As a result, we invariably create (construct) a world which has self-imposed, limited dimensions, and we therefore deliberately omit two of these — namely, change and the passage of time. This creates a contrary-to-fact stance: namely that time stands still, can be tethered, and that the structure of an event can be viewed as timeless, close to an “enduring reality”. But is this not a case of the Humpty-Dumpty problem: how to put the pieces together retrospectively, post-hoc, after the fall?

One solution may be to accept what had happened and only then back-track, to a pre-event period, before one attempts to reconstruct the world as it was before its fall off the wall and before we construct an alternative end-game, a narrative of its future. In doing so we accept that as observers we have the capacity to write alternative scenarios, no matter at what point the old story was interrupted and diverted.

If form is viewed as that feature of a narrative which gives a story its logical coherence — its rationale — it should be easy to see that any narrative consists of a series of vignettes which could occur in any sequence, or in any order except for the order itself. A story may emerge, but this may happen by chance, like the famed chimps pounding a keyboard and producing the text of Hamlet. Perhaps — but most unlikely.

Structure, it therefore seems, is a property of all things and should be viewed as a universal quality. The world is inconceivable (but only by us) without it, but no more so than a world which is bereft of distinct “instances”. Nevertheless, this world is a convenient fiction whose convenience-value needs to be clearly stated in terms which include the historical moment itself. Thus both Form and its complementary notion, Substance, are inherently stable, but only within limits. That is the conclusion reached here — and it bypasses the religious (pre-empirical) catechism that this is as was ordained! Our conclusion: our perceptions are constrained, but not ordained.

Nouns as Contrasts: Opposites and Differences

There are many terms that draw contrasts between opposites and many terms which are used to distinguish degrees of difference between things which are alike.

The terms “light” and “dark”, or “hot” and “cold” refer to relative contrasts; whereas “dead” and “alive” refer to opposite and exclusive contrasts of states, or conditions. We also contrast between events by using the adjectival form of a noun, as when we refer to a piece of bread — unquestionably an object — as “stone-hard” or perhaps as “doughy”. In short, nouns are often adapted to serve as adjectives, as qualifiers.

“Dough” is an object-name; “doughy” a quality ascribed to dough. In its adjectival form the word establishes a link between certain accepted features of an object and some other event which does not carry the description implied by the adjective. One may not oneself know all the attributes of a stone — a task assigned to geologists — but there are conventions that apply at moments in our history which allow us to create a “list of attributes” deemed appropriate to the object being discussed. When one attributes a new quality to a familiar object, previous meanings of this “object” become modified and extended. This is dramatically illustrated by contemporary dictionaries dedicated to slang!

Which comes first, the noun or the adjective? I suspect nouns do: these are often names of objects and as such serve as primary signals. But whether names come first or not is not a matter of importance. It may be easier to teach some species that a sound signifies an action rather than an object; similarly, some species may be more likely to learn to associate a sound with an object; or there may be stages in development which favor the acquisition of an association of a signal with an object rather than with distinct actions which the subject is required to perform. Is there a general rule which applies to every species, or are we talking about species-related matters, à la the ethologists? Pavlov and many others declared the former — but their assumption is no longer accepted as a general rule. Most modern biologists prefer the ethological position that species are quite limited in their perceptual and behavioral repertoire and that such limitations reflect much about their evolutionary history as a species.

What we need to be clear about is that our language from childhood onward distinguishes between names of objects — however such “objects” may be defined — and the likeness these objects have to each other, or to other features of our experience, features for which we have already forged names. Thus, “apples” are different from “oranges” and both are different from stones. The former two are edible (when ripe) whereas stones are never perceived as edible. Apple-seeds and orange-pips however are edible, although these could be mistaken for “small stones”! Each of us language-users learns which things are alike (or similar), to what degree this is so, and wherein lie the differences between them. To avoid confusion, humans learn from an early age what “likeness” means and under what conditions one takes precautions against assuming that likeness is the same as equivalence. It is not always easy to do so.

ELI5 — Explain Like I’m 5

Until last week I did not know the meaning of the acronym ELI5 — “explain like I’m five.” I’m convinced that this request sets the hurdles very high for me, but the goal is worth it. Five-year-olds are actually past the time when their main interaction with adults is “why?!”. If they do not understand an adult at that age they may just blink and walk away, or wrinkle their nose.

I shall therefore compose several pieces in the ELI5 mode. Hopefully adult readers won’t walk away. Please score me 1–10 — the higher the score, the less I have succeeded in stating my case. Give me a 1 if I have succeeded beyond my wildest dreams, and a 10 if I have failed miserably, or anything in between.

What does *explain* itself mean? Why is “why” asked by children or adults? What is expected of me, the one who is asked that question? I should pause, estimate the age of the questioner, and then proceed. My reply should be adjusted to my guess of the age of my interlocutors.

A five-year-old does not ask only to annoy because he/she knows it teases! The question is not asked to gain an adult’s attention or to get him/her away from whatever they are currently doing, like reading a magazine or watching a TV program which is likely to be boring to smart kids. I think it is asked for several reasons. Thus, the kid may not yet be aware that one needs to be very specific in how to formulate a question when asking adults, that it is different from asking other kids the same question. Kids do what adults also do: they ask questions in contexts and therefore assume that the person addressed can fill in all the blanks omitted.

Take the case of dropping a glass of water. The glass shatters. WHY? EXPLAIN! The question is probably not what an adult would ask: what was it about the glass or the floor which shattered the glass? Was the glass empty, half full, full? Was the floor carpeted? Wood? Cement? Were my hands wet, greasy? Was I inebriated, or do I suffer from palsy? None of this information would be helpful to the child! Say “the glass was wet and slippery” — it may be sufficient. Now wait for the next “why”! It will however be a different question — the continuation of a social encounter.

The Writing of History

Writers write for audiences which are presumably well defined in their minds. Their manuscripts are narratives addressed to their imaginary circle of friends and admirers, whom they entertain and cajole by the twists and turns of their extended tales.

Historians, on the other hand, have three audiences, and they address their comments to each in turn.


First, each historian speaks to his predecessors, correcting them where deemed necessary because of their inadvertent exclusions or misinterpretations. If exclusions were “inexcusable” — because the data was readily available to them with a little additional effort — it is likely that the culprit will receive a shellacking and be condemned. More often, however, exclusions are due to the stark fact that new data has since appeared and filled in details previously missing, rather than to any failing of earlier writers, who had relied on their historical imagination and substituted conjecture for missing data.

The second audience of a historian are his/her contemporaries, many of whom are assumed to be already familiar with the episodes to be discussed or the main characters of the narratives. Often it happens that the themes discussed are “modish”, are driven by contemporary problems. What roles did “spies” play in earlier periods? Did they exist at all, and what credence was placed on their testimonies? Recently there has been a spate of books and television programmes on Henry VIII and his chequered times, yet few readers or viewers will be familiar with the life and times of many of its “minor” characters, including its spies, who nevertheless contributed to the “story-line”. Christopher Marlowe, Anthony Bacon and Sir Francis Walsingham were all actors engaged in spying, but the first two had minor parts, whereas Walsingham was prominent as a defender of Protestantism and of English/Scottish independence from European hegemony, and had his “men” placed in several capitals of Europe.

The third audience a historian has in mind when writing is less palpable: unborn generations of readers who view their received legacy through the eyes of past writers of note. The historian sees himself/herself as someone who may influence the future by providing an interpretation and a record of this past. Their influence may be minor or, on the model of Gibbon amongst others, they may be writers of considerable significance. The effect may be short-term — as in the case of H.G. Wells — or long-term, as Julius Caesar’s and Cicero’s have been. Generations of British leaders were schooled in both these classical authors during their school-days and as university students.

I am sure some writers do not recognize themselves as aspiring to such lofty heights but view themselves as journeymen, not prophets. All honour to such writers and purveyors of “truths”. Let us agree, furthermore, that historians are ill-advised to see themselves as politicians, as influencing by their writings or teaching the distant future, except indirectly. For the truth is that historians are ill suited to that task and — generally speaking — they are more honourable in their intent than most politicians. The interpretation of the past may influence our future actions as individuals or as communities, but how this plays out remains a puzzle and is best left to future historians to discern for each historical period.

The Case of Humpty Dumpty’s Singular Use of Words

“When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean — neither more nor less.”
“The question is,” said Alice, “whether you can make words mean so many different things.”
“The question is,” said Humpty Dumpty, “which is to be master — that’s all.”

Both Alice and Humpty Dumpty have a strong case. Alice is justified in her complaint that words can and indeed often do mean many different things and that this can be most confusing — and can be injurious. More confusing to some than others. There are those who simply cannot cope with confusion. Psychologists have referred to this as the fear of ambiguity. Like other fears it terrifies and often paralyzes people.

Consequently there are those who try to use words and sentences in a way which reduces ambiguity, for no other reason than that they wish to be clearly understood by others and that they want to understand what others are saying, i.e. to interpret messages correctly. Often such messages are trivial, but what if such messages are in fact of utmost importance? As in, “Call 911! Your house is on fire!”

Humpty Dumpty’s proposal that words are assigned a restricted meaning by fiat — “it means what I wish it to mean” — would result in the construction of an endless dictionary. Everyone would then have the right — even the obligation — to usurp the meaning of a word which already exists and enter it into their idiosyncratic dictionary! We would be forced to live in a world of neologisms and would also need all the sounds we could produce with our versatile larynx to cover our needs to communicate our thoughts to others! Babel will have come to pass. Existing languages would be wiped out! An unimaginable event. Just imagine the effect on our dictionaries.

The alternative to this bizarre doomsday scenario would be to deplete our vocabulary as far as possible, to whittle it down to the smallest number of items. Isn’t this familiar? Basic English was once favoured by one of the finest exponents of current English, Winston Churchill. Rewrite Shakespeare, even Jane Austen, into B.E. — and translate Tolstoy too!

But the current trend is diametrically opposed: we actually need more and more words to name things and to identify the many new items we have introduced into the world; the many new objects we have discovered or designed and manufactured; to mark novel events which form our personal and public experience and which help us to identify new norms; as well as the multitude of new ideas we have developed.

There are many people, dead and alive, who share my name — but I reckon few who also carry my initials, H.M.B. — and fewer, if any, who also share my birthday and birth year. These are all markers of my identity. So using this method, Alice’s complaint could be taken care of, simply by embellishing existing words, by adding descriptions when appropriate or desirable, or by giving each set of words a context.

We do this already: “Yonder horse” is not any old mare, but a particular horse, perhaps the horse standing in the meadow nibbling grass, or kicking its heels. In this case *yonder* is not an object description, but one that identifies the horse by its context, rather than by such unique peculiarities as the length of its mane.

But I appear to have urged a tedious solution. If either Alice or Humpty Dumpty remains concerned that their use of a word may be misunderstood, they could hang out a warning sign: “This word is in my lexicon — but may not be in yours” — let’s say “ML” for short. It would distinguish this word from “this word is in our dictionary,” or “OD.”

“ML” refers to words which already exist in some language but which are also used in the lexicon which contains all the words I use — and understand — and which adds to each what I mean by the term. “OD” permits words to be used in a variety of ways and even encourages and promotes ambiguity, as is demonstrated by the number of homonyms cited in the average-sized dictionary. (Note: they all cite homonyms — a confession that words have multiple meanings, and that they are ambiguous.)

Usually a competent dictionary insists that context must be considered — something which restricts how the word listed is to be used. It represents a form of learning which can even be demonstrated by the rat and certainly by many birds. Why not then by bright little middle-class girls — and even by other Humpty Dumpties, on or off the wall? Each would be perfectly competent to learn how to use a word in different contexts without getting into a state of total confusion or self-centredness.

Some learn this with little training. There are always exceptions even to this rule. By the way — how many Humpty Dumpties are there?

A Clarification is Not a Definition

Some words are used interchangeably. This certainly enriches a language by giving it alternative ways of expressing the same idea — or so it is believed! Not so. An idea, object, or event (three different non-alternative entities!) may go under different names, as when a married woman refers to herself both by her maiden and her husband’s name, or when a person is described as married or wedded, or a diamond is described as a stone or gem, or an engine-propelled boat is referred to as a steamer or liner.

In some cases the objects were wrought at different times and their names reflect this. A steamer clearly is a boat whose engines are driven by a mechanism whose pistons are moved by steam which is produced by burning coal or oil, whereas a liner is a bigger boat whose propellers or pressure-operated valves may be moved by turbines and which carries passengers across oceans, not on rivers and lakes.

Language can be so rich — grammar arid.

A definition is a device, used rigidly or loosely, which is intended to “fix” a word, to assign it meaning. Rigid definitions are preferred in scientific disciplines, where ambiguity is deliberately shunned. Accordingly, a word cannot (should not!) have two or more uses. Each label on the bottle must be completely unambiguous, have only one interpretation, and fully describe or identify the content of the bottle. Most dictionary definitions, by contrast, are nonrigid, sometimes even fuzzy. What such definitions accomplish is to draw a circle around a claim. Inside and outside the circle are entirely and insufferably different. These demarcate — inform us — what a word means, but also what it does not mean! (Most of us live in a bi-modal world most — but not all — of the time!)

A clarification as a device tells us, in words we are expected to understand, what an unknown or ambiguous word or expression could mean and what it probably means in the context provided. A clarification may ease us into understanding a sentence in which the problematic word occurred! It does not scream at us “You used the word or expression incorrectly, falsely, heinously!” but indicates that we trespassed on an existing good usage of a word. It says: “No! it is not a liner, but a paddle steamer.” It has explained, has clarified, that the word used was not the best word to use under the prevailing circumstance, and thereby opened alternative choices to us.

The Universe as a Mystical Concept

The universe is a mystical concept. It is not the name of a fully known object, but refers to something which by definition cannot be known in the concrete.

Counter-examples: My dog Fifi and my cat Cleopatra are two household pets, each with a distinct appearance and ordinary as well as some unusual habits. I can give lots of descriptions of them which will make them familiar, even recognizable, to others. I could include a photograph of each and even an odometer-reading! Furthermore their existence is not in doubt, although each has problems articulating its own existence and neither can say “cogito ergo sum” or its equivalent in dog or cat language (of course I would not understand either).

What makes the universe such a problematic concept? I have in fact called it a “mystical concept”. The concept seems to be a member of a species of names of things to which new instances can be added without necessarily changing the concept. The concept is therefore open-ended, like talking about “all humans” even though additional humans are added as I write, each one presumably different from the others. What does it mean then to have a “Theory of the Universe” if it cannot be confined and contrasted with other events? The answer is that a “Theory of the Universe” is an inadequate manner of stating three entirely different problems:

  1. If the term *universe* refers to all that we know about the universe so far, it excludes what else could be discovered about it tomorrow. If you cannot complete the list you do not have a comprehensive concept! Adding “etc., etc.” or “ad infinitum” won’t do.
  2. If we examine the methods and means we have used to specify the universe, how can we be sure that there are not other methods that could have been used to draw up and codify its features? We have a nasty suspicion that other methods could be deployed or invented.
  3. If we can formulate an explanatory theory for all we already know about the universe, what would a replacement theory have to be like to be a candidate to replace the reigning theory? What features would it have? What tests would the theory have to face to continue as an unchallenged, non-contentious theory?

If we accept that problems 1–3 cannot be overcome, this may be because we are dealing with a mystical concept and not with a concept which covers a particular field.

In regard to #3, once we have discovered that our reigning “Theory of the Universe” has generated contradictions which have to be removed, it may be that we have already discovered additional matters about our universe which confound our earlier conceptions. Only theories which make provision for all contingencies are resistant to fault-finding, and these would include the presence of mystical entities and processes, like, respectively, “the universe” and “create”.

Finale: The world as I know it is not a mystical concept but contains a welcome admission that I do not, and cannot, know everything. Furthermore I cannot know all things equally well, and I can be incredibly ignorant about many things. It is a sad fact that the expression “abysmal lack of knowledge” has meaning and much application.