Part 1: Science does not prove — only declares

Note: Herewith several related comments on the role of “proof” in the contemporary natural and human sciences. A critical comment on the title of a recently-published book by A. Aczel — which carries the confusing title Why Science Does Not Disprove God (2014) — is also included. The first of my comments deals with what it is that “scientific activity” produces. The heading “Science does not prove — but only declares” summarises my conclusion: in brief, that the outcomes of scientific inquiries are a series of declarations on our current knowledge about “our world” which together constitute a comprehensive representation — a momentarily authentic picture of what the world is like.

What is commonly understood by *submitting a proof* and its opposite, *submitting a disproof*? When someone submits a proof they are said to demonstrate to others how a conclusion they themselves (privately) reached was obtained. More specifically, how they arrived at the conclusion by entirely logical means, and not by empirical demonstration.

If they told us that they just “felt that the vase they had unearthed” was a Greek urn which had once contained the ashes of a fallen warrior, we would call it a “guess” but not a true discovery unless the claim was supported by much more evidence or provenance! If they “show” that something they had foretold had materialized, like that a gesture made towards heaven produced a hail of manna, that is not a proof, but only a demonstration: it shows that their prediction on this particular occasion worked! Such predictions were once made routinely by reputed “wise men” but none have been recorded reliably for the past few hundred years.

A proof, on the other hand, refers to a post-facto event which states that whatever was initially said about a matter followed logically from some earlier explicitly-cited assumptions. For example, that 2 of anything added to 3 results in 5 items. In this case there is no doubt about the existence of the numbers cited, or that the number 5 can be generated in different ways. The assumptions may not be empirically true — often they are not! What we have here is a calculation which involves abstract, not empirical, events. Two goats standing in the meadow and three sheep grazing nearby make five animals. Contrast this to the claim that “When I put a match to the spout of this bottle a flame will emerge”. The answer to the question “how is this possible?” will require, amongst other things, a reference to specific, well-attested laws of chemistry, rules about which substances are flammable and which are not.

A proof, in short, refers to the outcome of clearly-stated logical operations. Such operations are traditionally performed only by humans — although many psychologists and biologists have argued that they are also found in some non-humans, but only in those species whose nervous systems have features similar to ours, containing, for example, neural circuits, a hemispheric brain, a cortex, and areas which have become centres of control for specific outcomes or operations.

Much has been written and speculated about the relation between the brain as a cohesive organ and as a processor of information, and how such information may eventually translate into states of awareness and into actions — but it is an ongoing, not a completed story — part of a book with many chapters of which only the first few have been written so far. The future, we predict, will surely offer many additional surprises, and these will be related to the fact that with time and much effort we may get to know more and more about the functional and structural properties of the brains of different species.

One enduring (and thus far unsolved) problem has been to account for corrections which are made by an individual member of a species as a result of their past experiences — and how these could be forwarded (transferred) to their descendants to facilitate the behaviour of unborn generations. Are there some aspects of our experiences which are coded so as to become transmittable from generation to generation, just as many bird-songs are? The empirical answers to such questions will most likely emerge within the foreseeable future, but in the meantime we can only create increasingly better questions and suppositions about what goes on within creatures which reflect changes in their daily lives, specifically how they come to predict some future events on the basis of their earlier experience or perhaps even by virtue of cross-generational transmissions. I could imagine, for example, a mechanism whereby a set of experiences could be transmitted across several future generations, with the traces of former experiences gradually waning and disappearing. One may need to exclude carry-overs from the immediate past because these changes may only reflect temporary matters, which traditionally were covered by the term *habituations*, i.e. transitional intra-organic changes which leave very few — or minimal — enduring residues for transfer to offspring.

When I state — as in the title of this blog — that “Science does not prove but only declares,” I mean that the fruits and outcomes of scientifically conducted investigations take the form of declarations which one has first presented to oneself. Modern science is a communal activity whose traces are found in a group of cohorts, and it usually demands that anyone who makes and accepts a new claim can and will defend it publicly in person — as at a scientific conference — or by circulating a documentary report of their investigations in a publicly available journal where it can be criticized by others!

One communicates one’s claim by issuing statements, which may contain abstract formulae, that summarize both what one has done to secure the information and what one has concluded from such earlier work. It is a declaration of the truth as seen by oneself which is made publicly known so that it can be openly viewed and, if so deemed, criticized! The declarer admits that they could be mistaken about some or even all of the summary conclusions presented, but hopes that little of what they claimed has to be withdrawn or revised as a result of criticism.

The popular statement that the “proof of the pudding lies in its eating” is therefore incorrect. The proof of an argument lies in the correctness of its logical derivation, something which requires that the steps taken accord with well-stated rules. The rules predate the investigation and the research. One assumes that all the assumptions made in an extended argument are correct, and do not contradict other explicitly made arguments. To be correct therefore assumes that what a statement declares is independently defensible, and therefore does not depend on the correctness of an individual’s perception alone. It is assumed to rely on the verification by everyone involved or concerned with the argument that what has been claimed can also be independently supported by applying a common method to the claim.

For example, several claims have been made throughout the last 1500 years that the shroud in which Jesus was wrapped after his body was taken from the cross — an event which is not doubted by most — was subsequently found and is now available for public display and examination. However, each shroud so far examined (there have been several) has failed to stand up to all the tests applied, including tests of its reputed age. Thus the hypothesis that the original shroud had been found has not been supported, and cannot be affirmed with confidence, but seems to rest on a wish to believe that such a shroud exists.

Of course, such wishes have no permanent place in scientific investigations but have to be abandoned regardless of their origins. (I’m sure the priests in Egypt believed their stories of the origins of humanity, just as the early priests of Judaism believed in their, may I add, fanciful account of the origin of women!) Evidence cited to support a “position” is often viewed as a distraction in such cases since nothing is stronger than the wish to believe.

My preference, therefore, has been to view each declaration in Science as a temporary, time-bound claim only. All these and similar claims are ultimately disputable — and are more than likely to be disputed. The claims may therefore need amendment(s) or may de facto be discarded under the heading, “was of one-time interest because its claim accorded with other plausible pictures or representations available at the time.”

The World as a Picture or Collage

In an earlier blog I introduced the term *collage* and distinguished it from related nouns, picture and presentation. A collage, as commonly used, includes recognizable objects but also arrangements and juxtapositions of items in an unexpected, spurious, curious manner. The collage itself may also include spaces between objects — blank spaces which have no identity except for their hue or lack of form, their formlessness. Look at the sky at night. Twinkles, some larger objects, some streaks of light moving at speed, respectively named stars, planets, airplanes or spacecraft. Also much darkness, emptiness.

So when we look outside ourselves we invent names for every item we can distinguish from its indifferent (black?) background. We make special efforts to do so, to order and arrange our perceptual world. Whenever we are unwilling or unable to identify a pin-point of light as a manifestation of an object we tend to speak of “the void” — and secretly treat it as an object! But — as we have learned — today’s “void” may be tomorrow’s treasure-chest, filled with fascinating objects which hold secrets to our understanding of our universe!

The history of science illustrates how fickle we are in this regard. The history of ourselves also tells how determined we are to complete a story — a fantasy — once begun. We seek “understanding”, not only recognition. We attempt the latter first, but when it fails we create objects and invent processes to assist our understanding. This has been the pattern since Aristotle raised “understanding” to our highest goal, the hallmark of our god-like nature.

The objects we distinguish around us may have clear relations to each other. Thus, several philosophers — and more recently some scientists — have urged that we study the act of perception and other attributes ascribed to humans (e.g. R.S. Peters: Motivation and D. Armstrong: Perception) more critically than our predecessors did, since it does not follow that everyday descriptions of ourselves, though old, are necessarily faultless or correct. Common sense, it is claimed, is not a good guide in these matters. We have been repeatedly warned not to assume that our current self-descriptions, and especially those of our so-called “states of mind”, have greater accuracy or authority than our descriptions of “the external world” (G. Ryle: The Concept of Mind), but that these are subject to great hazards. Better to be cautious than certain!

The trend throughout the 20th century has been to view descriptions of the external world as a scaffold which rests on the certainty of our perception of our own inner experiences, but one should remember that descriptions are invariably constrained by limits which reflect the descriptive habits of far earlier periods, periods which promoted their own “wisdom” and “habits of thought” and which are untainted by contemporary knowledge! History is only a record of our past achievements, which includes our failures to describe ourselves and our attributes well. A health system based on well-tried prescriptions from the past, a list of uncritically accepted cures?

The world as a picture therefore includes some temporary successes but primarily failures to describe “matters of current interest” in terms of dated concepts. This does not mean that the pictures of the past make sense, but only that some aspects of the composite may. The paintings of Marc Chagall are replete with suggestions of self-contained episodes — and this can also be said of paintings by the surrealists — yet we regard each as self-contained, not as an episode in a larger series of pictures whose outlines have never been seen!

Successful achievements and failures to achieve may just happen to come together — under the same umbrella, so to speak — like pedestrians seeking temporary shelter during a flash rain-storm. In that respect they form a collage. The term *collage*, furthermore, is not currently part of the elaborate vocabulary of philosophy, or of cosmology, but is more at home in the arts than in formal disciplines. It stands for the idea that we normally judge something after “scanning”, that the idea of a moment is imprecise and covers too many judgements based on a succession of temporary impressions viewed as a composite. In philosophy itself the term “theory” has long been elevated to a paramount position to contrast with formlessness, with the notion that the pieces in hand cannot be assembled into a whole. The emphasis is on an “integration” of seemingly coherent parts into a wider, more comprehensible position, of bringing “ideas” together (see a classic of the genre, namely, A.N. Whitehead: Adventures of Ideas, 1933). Perhaps *collage* should become part of the working vocabulary of philosophy?

Philosophers have often claimed that they were concerned with eternal verities, with matters which are not confined by the limits of time, matters which have lasting value. As self-declared lovers of wisdom, philosophers are often assumed by others to be priests without a formal religion. They were bound to their own beliefs and therefore carried an obligation to defend these against the multitudes, the “common people”, as well as against others equally skilled in handling thoughts and speculations. They were said to theorize, to discuss theories as objects, just as scientists discuss their methods of inquiry and what it is they have already achieved or hope to achieve through the rigorous application of such methods. By common consent these methods were the rewards of discovery: they were viewed as tools of discovery which could be ordered, a process which demands that each move gets evaluated by agreed criteria. For scientists, then, the discovery of a method was a “rightful tool” which had as much significance as a miniature screwdriver has for a watchmaker, or a needle for a tailor. (Threads or strands of fibres existed before needles!) Two centuries ago we discovered and developed the tools of statistical analysis, learned how aggregates of measures of a trend can be used productively, and saw how this helped to change our studies and investigations of “natural” but also of “social” phenomena (appearances). It is useful to keep this analogy in mind whenever we discuss “science” and what it suggests to us about the nature of Nature (Aristotle’s quest).

Philosophers have singled out logic and the analysis of arguments as their primary tools. Logic has been used to analyze the consistency of existing arguments, or of fragments of an argument, especially beliefs widely held by others. It is used to show where an argument would lead if it were pursued rigorously, or to demonstrate that a particular argument may itself be based on empirically false premises. For centuries there has been an understanding that sooner or later errors in logical derivations from premises would surface, and that this would automatically lead to the rejection of the argument as a whole! This has happened occasionally, but not consistently or always. More recently there have been discoveries of contradictions which had remained hidden because the argument had been terminated too early for such contradictions to surface. This potential fault line may have been dealt with by translating an argument into mathematical form and testing it with the help of high-speed computers. The results to date have only shown that, all things considered, the chances of encountering a contradiction are disturbingly high. It means, in effect, that we cannot guarantee — as was initially required — that an argument is logically faultless and impervious to contradiction. Yet without this the aim of a logical analysis cannot itself be guaranteed to be faultless — that is, to represent an unbroken line from given premises to conclusion. A conjecture can be correct even if an argument to support it is faulty.

Many early philosophers, unlike priests, were not inclined to employ arguments to support a viewpoint for which they could not find independent support: their task was primarily critical. One states the premises and then works out the implications. The model was that an argument starts with some widely held and unchallenged conclusion — e.g., “eating pork is bad for your health” — and then proceeds to demonstrate that the conclusion has been reached by following authorized logical procedures. In this respect philosophers have acted more like teachers and sages than defenders of an official faith — a habilitated-belief — something which might set them on a collision course with the viewpoint of a powerful “establishment”, where official views were backed by an enforcement agency. Habilitated-beliefs are a new concept and will be discussed in a separate blog.

(for Tim)

How To Map a New Word

Neologism

In a previous blog I suggested that any new English word, or neologism, could be submitted to a computer search of the digitized English literature, say from Beowulf onward, in order to discover whether the term of interest had previously been used. In what manner was its earlier use, its context of use, different from what is now proposed?
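
For readers who like to see the idea made concrete: the sketch below (in Python, used here only for illustration) shows one way such a search of a digitized corpus might be carried out. The directory name, file layout and helper function are hypothetical; a real search would run against a far larger, properly indexed collection.

```python
# Minimal sketch: scan a directory of plain-text corpus files for prior
# uses of a candidate neologism and show each hit with a little context.
# "digitized_corpus" and find_prior_uses are illustrative names only.
import re
from pathlib import Path

def find_prior_uses(word: str, corpus_dir: str, context: int = 40):
    """Yield (file name, snippet) pairs for every occurrence of `word`."""
    pattern = re.compile(rf"\b{re.escape(word)}\b", re.IGNORECASE)
    for path in Path(corpus_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        for match in pattern.finditer(text):
            start = max(match.start() - context, 0)
            end = match.end() + context
            yield path.name, text[start:end].replace("\n", " ")

if __name__ == "__main__":
    for source, snippet in find_prior_uses("dru", "digitized_corpus"):
        print(f"{source}: ...{snippet}...")
```

If the search returns nothing, the word is a candidate neologism; if it returns hits, the snippets let one compare the earlier context of use with the use now proposed.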

New words are often proposed as replacements for a current expression on the understanding that the neologism will be accompanied by clear guidelines for its use. Such guidelines are also referred to as definitions, or re-definitions. What was earlier called “a temporary bunch of words” may now qualify as a new single word. Its meaning would then be viewed as the area partially covered by each of the words originally tied together to form a bunch or an expression.

The following example may help: the first letters of each of the following words, “Dependable, Redoubtable, Unimpeachable”, spell *dru*. It is a new term. Objects like trees would be automatically excluded as being “inappropriate”, whereas one could, for example, say, “John Dewey is a dru person”. It would give a reasonably clear image of the kind of person this great American philosopher was! (Of course, the statement may be regarded as a good or a poor description of the person.)

However, a composite word like *dru* should not be viewed (as was done formerly) as a one-dimensional overlap of qualities, like a series of circles which overlap a common area, but as covering a meeting point in multidimensional space, which may also extend over time, in which case one should state the temporal parameters. This is what “mapping a word” is all about.
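
A rough illustration of this “mapping” idea, again only a sketch: each contributing quality is treated as one dimension, and a candidate (a person, a tree) becomes a point in that multidimensional space rather than a one-dimensional overlap. The dimensions, scores and threshold below are invented for the example and carry no authority.

```python
# Minimal sketch: map the composite word *dru* onto points in a
# multidimensional "quality space". Scores are hypothetical ratings 0.0-1.0.
from statistics import mean

dimensions = ["dependable", "redoubtable", "unimpeachable"]

john_dewey = {"dependable": 0.9, "redoubtable": 0.8, "unimpeachable": 0.95}
a_tree     = {"dependable": 0.0, "redoubtable": 0.0, "unimpeachable": 0.0}

def dru_score(ratings: dict) -> float:
    """Collapse the multidimensional ratings into a single summary value."""
    return mean(ratings[d] for d in dimensions)

def is_dru(ratings: dict, threshold: float = 0.7) -> bool:
    """A subject counts as 'dru' only if it scores highly on every dimension."""
    return all(ratings[d] >= threshold for d in dimensions)

print(is_dru(john_dewey))   # True  -> "John Dewey is a dru person"
print(is_dru(a_tree))       # False -> trees are excluded as "inappropriate"
print(dru_score(john_dewey))
```

A temporal parameter could be added as a further dimension (ratings at different dates), which is what stating the temporal parameters of the mapping would amount to.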

Conjectures and Neologisms

We are living at a time when we are frequently asked to transcend the limits of conjecture. *Be inventive*, *be creative*, *stretch the envelope* are expressions widely used to refer to this. More and more of our thinking is directed towards situations which need to be described in terms of sequences, or as involving successive different processes, rather than as individual (hence stable) events, frozen in time like pottery in a Victorian display case.

By contrast, many early Greek thinkers — often mentioned as founders of our philosophical tradition — espoused the view that time and change are unreal, that there is indeed a real world from which process and progress are excluded, and belong to a chimaeral world (see Plato’s discussions of these issues in Timaeus).

In our own time the more common view runs in the opposite direction: it is suggested that we are the agents (the guilty party, as it were!) who freeze events. By doing so, we create a notion of change which in turn requires us to invent agents of change. We invent causes when we feel trapped, without explanations for events, and do so in order to account for our discomfort. Not to have an explanation is experienced by many as a deficiency, whereas a process of reification, whereby we impose stability and structure on the world, is often experienced as living in a predictable world! We invent and stipulate (conjecture?) processes which give flesh and bones to events, and often create homunculi with great powers to lift and shift events “out of their orbit” (a pre-Newtonian concept).

Indeed, Western philosophy — under which I include what some Greek thinkers referred to as *natural philosophy*, or the study of natural phenomena — is haunted by the image of two worlds: a world of nature, which obeys and follows its own eternal rules (discoverable by us), but also a world made by us, one which is mostly beyond and unaffected by naturalistic rules, and which is commanded by what Gilbert Ryle referred to as “the ghost in the machine”. This world supposedly lacks universal rules but develops from emerging trends; it is modish and unpredictable, yet yields some of its secrets post-hoc, when we reflect on our past. There is an ever-growing literature which interprets the work of some of our major artists (past and present), a trend which is most likely to continue for the foreseeable future, even by our descendants when living in outer space, off-earth.

The picture is confused but may become more coherent during future discussions, and in step with an increase in our understanding of how human thinking emerged from simple interacting neural networks into the complex storage and processing organ it has become, whose own limits of growth (internal or external) and capabilities are at present unknown. (Robots could be viewed as external drives, extensions to the living brain.)

There are few (if any) natural phenomena of which it can be said that they remain unaffected throughout the passage of time, or the procession of events. On the contrary: the question is to estimate to what extent events have already changed, although the names of these events have been retained, and to estimate to what extent these events are likely to be transformed in future. Some events appear to remain unchanged over time, whereas others transform. The current debate about *climate change* is an example. To cite a different example: *The Battle of Waterloo* is viewed as a stable event, although writers disagree about what happened on the battlefield and about its details. The Battle of Waterloo is a historical concept, but what is discussed amongst historians are features of this event, not whether the event occurred.

To illustrate the difference between a concept and its meaning I have chosen the term *human family* which serves as the name of a phenomenon but which is also recognized by those working in the area of human relationships and institutions as a moveable feast, something which has changed throughout the course of our history.

The Human Family

*Family* is the name given to a common feature of all human societies. It is a concept which represents an event which has temporal as well as structural and functional properties. The task of any writer/reporter is to create a portrait of the family which permits readers to analyze the relationships between members. A society may prescribe what is permissible or not to those included within a family in contrast to those external to it, e.g., whether members within a family can marry, or whether marriage must necessarily be exogamous. Whatever the rules, these can change, and the conditions under which such changes occur would then be viewed as factors influencing family structure.

It is important that the structure of an event is correctly portrayed, that it is attributed to an event which occurred as stated. The birth of a male or female child is celebrated differently in most societies and is also influenced by the order of birth — both are structural factors. To what extent does birth order play a role in determining the future of a male? Which son of a large landowner is likely to be encouraged to follow a career in the Church? (Answer: probably the third son, in Britain, throughout many centuries.)

Birth order is a temporal factor whereas male/female is functional, that is, it determines what roles will most likely be assigned to a person and when. When? The passage of time is viewed as an independent factor, not as something doled out in fleeting timeless moments, but more like a ceaseless conveyor belt. The term *moments* therefore carries with it interpretative problems, as indeed do such terms as *childhood*.

Admittedly, the above is vague. We do not normally take an arbitrary selection of words, words which are unrelated, and then stitch these together: our selection is more orderly, more contrived. What is clear, however, is that humans appear to be continuously engaged in extending their language, in stretching the limits of what they already have. It is their response to current prevailing circumstances, to being members of a community which appears to seek and build new environments to inhabit, environments which secure and preserve their existence and extend their survival rather than abandon it. We need to remind ourselves how relatively short the past of our species has been measured against the estimated life of our planet and solar system, how minute has been what we often refer to as “the life of the mind”, and how fragile are the conditions which sustain our species.

We introduce new words with increasing frequency. Neologisms may be viewed as transformative tools which in the past have extended our control over many, but not every, discernible feature of our world. There is of course no guarantee that such creative actions can continue unabated as has happened in our recent past. Our creativity has also produced conditions which threaten our continued existence. Other species have become extinct, although (as far as we know) inadvertently, not through self-destruction. Many species have lost control over their environmental niche. Humans have come perilously close to doing so, and many now claim that we have interfered with environmental factors to an irreversible extent, so that the earth will be unable to support human life.

The meanings of many words are unquestionably related to their effectiveness in identifying events but there is an additional dimension which is related to the historical context of their use, the role a word plays in mapping the world for its current users. Such referential words, new or old, help to define the contours as well as the interior features of our culture, something which applies even to those words which seemingly are entirely referential.

Research Episodes

In a recent blog I commented that:

…we continue to be committed to the idea of extending our current knowledge. For this to happen we should be willing to add but also to abandon ideas. This requires that some old ideas, no matter how venerable or favoured, get replaced. The criteria would be that the replacement-ideas are expected to do a better job of explaining what we call our “current raw data”, that is, materials previously gathered and collected during “research” episodes, but which have not yet been methodically and systematically processed and sorted.

There are several ideas here which merit further discussion.

Foremost is the notion of a *research episode*, which I view as a prolonged and systematic inquiry into one or more well-articulated problems, where each problem studied relates to some earlier research. There are many examples which could be cited, for instance the many cited in Hawking’s A Brief History of Time (1988), but my own research was heavily influenced by two newcomers in the early 1950s: ethology (a form of studying animal behaviour: Niko Tinbergen, The Study of Instinct, 1951) and the study of operant behaviour as advocated by B.F. Skinner in his book The Behavior of Organisms (1938).

I often met Tinbergen on his regular visits to lunch with my friend and colleague B.M. Foss at Birkbeck College; I met Skinner in 1951 in Sweden at the International Congress of Psychology and thereafter every few years. We stayed in contact for the next 35 years. Both had founded new lines of research which reached far beyond Oxford and Harvard, and each gave birth to a distinct “school” of thought which led to significant research efforts by others throughout the world and which expanded into fields of study other than the “Herring Gull” or the pigeon in its “Skinner box” pecking at discs. Both men deeply influenced the way we in the 20th century thought about our world.

I use the term *research episode* in a wide sense: not as confined to a short period of time, or as associated with a particular individual, but as a period within an existing science which may develop considerable momentum as new problems are explored by an increasing number of investigators (often on a cross-disciplinary basis). The methods and ways of thinking about problems are influenced as new frontiers of inquiry are reached and breached. Such episodes may start as a distinct, even limited form of inquiry, and then may expand either slowly or rapidly to cover more and more “problem areas” and also inadvertently “invade” other territories.

The ethology of Tinbergen, or “instinct theory” as it was often referred to in its early days, had a profound impact on comparative neurophysiology — and continues to influence it. It extended, and dated, the earlier concepts of Pavlovian neurophysiology, which had started almost three quarters of a century earlier. Pavlov’s thinking itself was influenced by the notion that the nervous system was a direct extension of the reflex-arc and by the idea that all neurological systems were built on a similar, closely related architecture. Differences were attributed to levels of complexity, an approach which eschewed the idea that levels of complexity could be the source of irreconcilable differences in nature itself.

What is research? The term *research* is well established. In English it comes from the verb “to search”, to look into and to look for. It covers everything from trivial efforts — like finding the birth date of a favourite composer or author — to issues which require prolonged investigation, e.g. how honeybees return to their hives after foraging, or how, on their return, they inform other bees of the location of a flower patch recently visited. Doing research invariably involves identifying a specific problem or set of problems and following each of these to the point where the most central questions seem satisfactorily answered.

In practice the original issues which first aroused one’s interest become modified en passant and are reinterpreted, and as a result of such reinterpretations the conceptual net often becomes larger. It seems that two separate tasks are involved in research. The first requires much skill in asking questions. This has to be learnt and is a skill honed through experience. One has to learn how to ask the right questions, something which may require a long apprenticeship. The second requires that one learn how to translate a question — however it was initially stated — into a method of discovery, a method of enquiry.

The first example refers to something done quickly, in a jiffy so to speak! Today all one needs is a computer with Internet access and the know-how to search for answers in Wikipedia or similar sites. Most kids in my neighbourhood know how to do this. Some are wizards at it even at a tender age! No need for them to know anything more than how to approach a computer and ask questions, or so it seems. No need to memorize answers when it is so easy to access the memory of a computer! The “search” episode can therefore be very short, whereas understanding the answers discovered may take much longer! It is different with questions about how honeybees communicate the direction and distance from hive to food source and back! Do bees learn from their mistakes — like we do — or is there little tolerance for those who pass on misinformation to their hive-mates? Furthermore, many questions cannot be answered by referring to the work of one’s predecessors. One enters the forest alone, without companions, and with luck or skill exits at the other side.

Every doctoral dissertation supposedly consists of a new contribution to knowledge. New? The true story is that one asks questions which invariably lean on the work of others. Of course, one may lean on a house of cards or neglect the work of unknown predecessors. One may avoid errors by acquiring extensive knowledge of the history of a problem, yet errors and omissions are unavoidable, although one can learn to reduce these in time.

Yet asking questions such as those already mentioned takes place in a context. Broadly speaking the context is the culture of the petitioner(s). Although each question follows a string of earlier questions, the sequence is not necessarily orderly. The logic also is not rigid but is often a heavy mixture of materials drawn from earlier periods which are themselves infused with analogical materials, such as asking: what if all animals are like the branches of a tree, a common trunk from ground to sky, which branch out in familiar fashion? There is also often some element of “serendipity” which helps to uncover clues en passant — often rather unexpectedly.

These clues can dramatically change the order of discoveries made. Wrong leads are familiar to most experienced researchers. However, orderly sequences do occur, as during conversations between like-minded people, or when one person instructs another in a teacher-pupil relationship. One guides the other. Conversations between colleagues also keep a discussion on track and encourage each discussant to follow the implications of their thoughts. Some discussions are guided by appointed chairpersons; others move along with less structure, but may nevertheless reach comparable conclusions.

Conference organizers often try to follow this model. Left on their own most people — even disciplined, somewhat compulsive and single minded professors — “skip” from topic to topic without raising questions in a coherent manner, as if questions can be peeled layer by layer like the wrappings of a Christmas present, no matter where you start! The more wrappings the greater the excitement! Ultimately the core is exposed.

*Culture* is a flexible concept. Applied to a modern community it covers the idea of a mix of micro and macro cultures. But there is a significant difference between a group — viewed as an aggregate of individuals — and a culture. A culture involves a group of individuals, marks them as belonging together by virtue of common interests, not common physical markings. What is it that individuals prefer, what draws them to each other, what holds them together over time despite diversity of experience, physical dissimilarities? Those who are devoted piano players of Mozart or attend exhibitions of Picasso are already on board — as it were — and have cultural affinities. Whatever binds their interests and commitments may be limited, but forms a common ground.

In time, three men in a boat will form a community, functional or dysfunctional. In short, although there may be significant differences in the affiliations of individuals who form a group — the Thursday evening concert goers, say — these serve as the bricks from which a modest dwelling can be built. Thus individuals are also viewed as members of smaller communities, whereas none are likely to be members of all the groups which make up the society as a whole.

What about “new arrivals”, i.e. immigrants? These go through an acculturation period and process which can vary from one generation to another. At first each is reared as a member of several small social groups, but this changes, so that mature adults often become members of several quite distinct groups whose interests and interactions they share for some, but not all, of their time.

Take a standard example of how we may come to get involved in a problem and in attempts to find its solution. The problem may be complex, may not have a single solution but be a multiple problem with solutions for one but not for all aspects of the original problem. “Why did the hen cross the road?” This event happens all the time in country lanes, but never — as far as I know — on Bloor Street in Toronto, or Hyde Park Corner in London. What catches our attention and arouses our curiosity most often are what to us are unusual happenings: hens crossing city roads being one.

Take another example: I visit a learned friend’s home for the first time and note that his opulent library is arranged with books placed on the shelves in order of size, not colour, not content, not alphabetically or thematically. My initial shock turns into curiosity. Why do it that way? I sense a problem and I rummage for ideas I have had about organizing my own library, about what we know about the psychology of collectors, about library science. I do so for two related reasons: I wish to explain to myself what I have seen and perhaps share my explanation with friends and colleagues! There is a leap from individual perplexity — based on personal ideas about what is normal and what people do routinely — to an awareness of a general problem, that my problems are prototypical of those of others.

This general problem can be expressed in the following manner: what leads people to organize their phenomenological experiences into categories, and what consequences follow from adopting a “grouping routine” developed by an individual and by a group of cohorts?

In both examples the initial question represents the first step in what could turn out to be a long series of successive steps. Each answer is likely to lead to additional questions, then to more enquiries. Had I asked a pedestrian question, like who designed and built St Paul’s Cathedral in London, an answer would be readily available by consulting an on-line (internet) encyclopedia. To help distinguish between these two types of inquiries it is fitting that we give each an appropriate name. I suggest that the term “research episode” be used for those many cases where the answer to a question (a) is not already readily available; or (b) where the search for an answer requires that one pursue several different alternative hypotheses which develop during the search. Some of these hypotheses will be rejected but others may serve as stepping stones, or toeholds, to additional answers and wider, perhaps newer areas of research.

I believe that the formal concept of a “research episode” is new. It is categorical — not canonical — in the sense that the concept helps us to organize what is already known independently, prior to the application of the category to the material. These categorical concepts may in time be elevated and become canonical, that is, become part of an established religion! An example may help: suppose you are given a 5000-piece jigsaw puzzle and several possible blueprints. One way of tackling this frustrating task is to conjecture an idea of what it is — a painting by Picasso or Turner perhaps — or to work on an entirely different presupposition, namely that the puzzle will form a square or oblong picture, or perhaps a round or oval one. On what basis are these suppositions made? What clues were used, if any? If one were told in advance the identity of the painter, or perhaps the topic of the painting, the task would be easier. (Note: we rarely enter such tasks naked; we usually get a chance to prepare ourselves — and this illustrates the importance of approaching any task with some preparation and some expectation of what is likely to happen once we start our journey of exploration.)

Suppose you find only 100 pieces of a puzzle. If told that the completed puzzle is a rectangular picture, you know that you need only 4 right-angled pieces to form the corners. The chance of hitting a corner piece at random is now 1 in 25 (4 out of 100), far better than the odds when all 5000 pieces are in play! “Detect corner pieces” and “detect those right-angled pieces which define a corner” are procedural imperatives which are categorical, and may lead to the solution of the task. But if the picture is oval? Heaven help you — you will have to start by gathering pieces together by colour matching.
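
The arithmetic behind the corner-piece odds can be spelled out in a few lines. The snippet below is only a sketch of that calculation: it computes the chance of drawing one of the four corners from the 100 pieces in hand and, for comparison, from the full 5000-piece set.

```python
# Minimal sketch of the corner-piece arithmetic: four corners among the
# pieces in play, so the chance of a random draw hitting a corner shrinks
# as the number of pieces grows.
from fractions import Fraction

corners = 4

for pieces_in_play in (100, 5000):
    odds = Fraction(corners, pieces_in_play)
    print(f"{pieces_in_play} pieces: chance of drawing a corner = {odds} "
          f"(1 in {odds.denominator / odds.numerator:.0f})")
# 100 pieces:  4/100  -> 1 in 25
# 5000 pieces: 4/5000 -> 1 in 1250
```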

ELI5 — Explain Like I’m 5

Until last week I did not know the meaning of the acronym ELI5 — “explain like I’m five.” I’m convinced that this request sets the hurdles very high for me, but the goal is worth it. Five-year-olds are actually past the time when their main interaction with adults is “why?!”. If they do not understand an adult at that age they may just blink and walk away, or wrinkle their nose.

I shall therefore compose several pieces in the ELI5 mode. Hopefully adult readers won’t walk away. Please score me 1-10 — the higher the score, the less I have succeeded in stating my case. Give me a 1 if I have succeeded beyond my wildest dreams, and 10 if I have failed miserably, or anything in between.

What does *explain* itself mean? Why is “why” asked by children or adults? What is expected of me, the one who is asked the question? I should pause, estimate the age of the questioner and then proceed. My reply should be adjusted to my guess of the age of my interlocutors.

A five-year-old does not ask only to annoy because he/she knows it teases! The question is not asked to gain an adult’s attention or to get him/her away from whatever they are currently doing, like reading a magazine or watching a TV program which is likely to be boring to smart kids. I think it is asked for several reasons. The kid may not yet be aware that one needs to be very specific in how one formulates a question when asking adults, and that this is different from asking other kids the same question. Kids do what adults also do: they ask questions in contexts and therefore assume that the person addressed can fill in all the blanks omitted.

Take the case of dropping a glass of water. The glass shatters. WHY? EXPLAIN! The question is probably not what an adult would ask: what was it about the glass or the floor that shattered the glass? Was the glass empty, half full, full? Was the floor carpeted? Wood? Cement? Were my hands wet, greasy? Was I inebriated, or do I suffer from palsy? None of this information would be helpful to the child! Say “the glass was wet and slippery” — it may be sufficient. Now wait for the next “why”! It will however be a different question — the continuation of a social encounter.

The Writing of History

Writers write for audiences which are presumably well defined in their minds. Their manuscripts are narratives addressed to their imaginary circle of friends and admirers, whom they entertain and cajole by the twists and turns of their extended tales.

Historians, on the other hand, have three audiences, and they address their comments to each in turn.


First, each historian speaks to his predecessors, correcting them where deemed necessary because of their inadvertent exclusions or misinterpretations. If exclusions were “inexcusable” because the data was readily available to them with a little additional effort, it is likely that the culprit will receive a shellacking and be condemned. More often, however, exclusions are due to the stark fact that new data appeared later and filled in details previously missing; there is little point in forcing continuity upon earlier writers who had relied on their historical imagination and who substituted conjecture for missing data.

The second audience of a historian are his/her contemporaries, many of whom are assumed to be already familiar with the episodes to be discussed or with the main characters of the narratives. Often it happens that the themes discussed are “modish”, are driven by contemporary problems. What roles did “spies” play in earlier periods? Did they exist at all and what credence was placed on their testimonies? Recently there has been a spate of books and television programmes on the Tudor court and its chequered times, yet few readers or viewers will be familiar with the life and times of many of its “minor” characters, including its spies, who nevertheless contributed to the “story-line”. Christopher Marlowe, Anthony Bacon and Sir Francis Walsingham were all actors engaged in spying, but the first two had minor parts whereas Walsingham was prominent as a defender of Protestantism and of English/Scottish independence from European hegemony, and had his “men” placed in several capitals of Europe.

The third audience a historian has in mind when writing is less palpable: unborn generations of readers who view their received legacy through the eyes of past writers of note. The historian sees himself/herself as someone who may influence the future by providing an interpretation and a record of this past. Their influence may be minor or, on the model of Gibbon amongst others, they may be writers of considerable significance. The effect may be short-term — as in the case of H.G. Wells — or long-term, as Julius Caesar or Cicero have been. Generations of British leaders were schooled in both these classical authors during their school-days and as university students.

I am sure some writers do not recognize themselves as aspiring to such lofty heights but view themselves as journeymen, not prophets. All honour to such writers and purveyors of “truths”. Let us agree, furthermore, that historians are ill-advised to see themselves as politicians, as influencing the distant future by their writings or teaching, except indirectly. For the truth is that historians are ill suited to that task and — generally speaking — they are more honourable in their intent than most politicians. The interpretation of the past may influence our future actions as individuals or as communities, but how this plays out remains a puzzle and is best left to future historians to discern for each historical period.

Final Theories for Ultimate Problems

Part of the series, “Thoughts about a Final Theory of the World”

Theories are only produced by humans. This may change in future, during the age of robotics and AI (artificial intelligence).

Each theory consists of a set of ideas about things and matters, i.e. is about something. Theories have content. Theories need not cohere, may conflict mildly or may even be irreconcilable. Many are. This is how it has been in the past and is likely to continue to be in future. Some writers have regretted that this is so, and have tried to reconcile conflicting theories, usually without success. I suspect it is a trend that will continue, but may not be more successful than previously.

We don’t know who created our first theories, when this happened, or what prompted them. It is like asking about the origin of the mind. Since there are no written records older than c.5000 years, we can only surmise what early theories were like and what was covered by them. Presumably such theories were very elementary, and far too general. The earliest written records refer to management/administrative topics, not to matters of literary, ethical, or religious interest, and not to what today would be described as problems of knowledge, specifically what we know and how we have come to know it, about ourselves or related matters.

Pictorial records go back further than written records, about 40,000 years, whereas the age of Homo sapiens has been estimated to be close to 200,000 years, and could be even greater. Related species, like Homo neanderthalensis, are older. In summary, specific knowledge about our own origins continues to be sparse, although our knowledge about more general matters is relatively substantial and falls under the rubric of cosmology. “Man, know thyself” remains a prescription.

The current rate of the growth of knowledge is prodigious. In the nature of the case, we cannot know who, when, and where a final theory will be formulated, if ever: it depends among other things on whether humans continue to be around, whether they will self-destruct, become extinct, or whether perhaps humans move to another planet and/or become a new super-human species.

Perhaps humans will invent devices which will replicate what we currently call creating ideas. The eminent physicist and writer Steven Weinberg has written about the idea of a final theory — without claiming, however, as some of his physicist predecessors did, including Newton and Einstein, that we were close to developing a theory which would “explain everything”, and in this limited sense would also be “final” (S. Weinberg, Dreams of a Final Theory, 1992, Pantheon Books, NY).

For several reasons, I think this is unlikely to happen. How would we actually know which of a number of entries to a competition to create a final theory will be “final”? A sounder reason is that theories contain speculations about an admittedly finite universe, beyond whose current limits we, as theorists, continuously aim to expand. In other words, those who speculate beyond what is known at present thereby declare that there is no final theory. In this manner we could fashion a universe which is beyond our immediate knowledge. A final theory would simply curtail further speculations — which is not the aim of the game of theory-construction.

It would be fairer to state that we continue to be committed to the idea of extending our current knowledge. For this to happen we should be willing to both add and to abandon ideas. This requires that selected old ideas, no matter how venerable or favoured, get replaced. The criteria would be that the replacement ideas are expected to do a better job of explaining what we call our “extant raw data”, that is, materials previously gathered and collected during “research” episodes, but which have not yet been methodically and systematically processed and sorted.

Given that our research methods are highly varied, are growing prodigiously, and are often unusual and “esoteric” — as when we use sub-atomic particle colliders such as the one at CERN — the data produced is correspondingly difficult for the average person, unschooled in current techniques, to understand. Ordinary people don’t have the time or training to keep up with developments in the increasing number of different disciplines which have found a place in our universities and research institutions. Many have difficulty keeping abreast of developments in their own spheres (bubbles?) of interest.

Most of us therefore follow a path of least resistance, the path also taken by most of our predecessors, and leave those things which are beyond our own understanding to latter-day experts. Not priests (God forbid!) but people steeped in their realms of knowledge. Even if all experts in different domains agree amongst themselves, it is possible that the next expert, perhaps from some new domain, will not. This reality excludes finality — but also ensures continued progress in our expanding spheres of study.

Good Theories?

Not all flights of fancy, old or new, qualify as “good theories”, or as constructive proposals. There is however much at stake in producing better theories than those currently available. So-called “good theories” include those which address social and practical problems and which have a reasonable chance of yielding outcomes that meet some of our common needs, our ethical standards and our social aspirations, while excluding self-immolation and the destruction of our species (and with it our civilizations).

I plan to add additional comments on these topics shortly. There is much to write about!

The Universe as a Mystical Concept

The universe is a mystical concept. It is not the name of a fully known object, but refers to something which by definition cannot be known in the concrete.

Counter-examples: My dog Fifi and my cat Cleopatra are two household pets, each with distinct appearances and ordinary as well as some unusual habits. I can give lots of descriptions of them which will make them familiar, even recognizable to others. I could include a photograph of each and even an odometer-reading! Furthermore their existence is not in doubt, although each has problems articulating their own existence and neither can say “cogito ergo sum” or its equivalent in dog or cat language (of course I would not understand either).

What makes the universe such a problematic concept? I have called it, in fact, a “mystical concept”. The concept seems to be a member of a species of names of things to which new instances can be added without necessarily changing the concept. The concept is therefore open-ended, like talking about “all humans” even though additional humans are added as I write, each one presumably different from the others. What does it mean then to have a “Theory of the Universe” if the universe cannot be confined and contrasted with other events? The answer is that a “Theory of the Universe” is an inadequate manner of stating three entirely different problems:

  1. The term *universe* refers at best to all that we know about the universe so far, but not to what else could be discovered about it tomorrow. If you cannot complete the list you do not have a comprehensive concept! Adding “etc., etc.” or “ad infinitum” won’t do.
  2. If we examine the methods and means we have used to specify the universe, how can we be sure that there are not other methods that could have been used to draw up and codify its features? We have a nasty suspicion that other methods could be deployed or invented.
  3. If we can formulate an explanatory theory for all we already know about the universe, what would a replacement theory have to be like to be a candidate to displace the reigning theory? What features would it have? What tests would it have to face to continue as an unchallenged, non-contentious theory?

If we accept that problems 1-3 cannot be overcome this may be because we are dealing with a mystical concept and not with a concept which covers a particular field.

In regard to #3: once we have discovered that our reigning “Theory of the Universe” has generated contradictions which have to be removed, we may already have discovered additional matters about our universe which confound our earlier conceptions. Only theories which make provision for all contingencies are resistant to fault-finding, and these would include the presence of mystical entities and processes, like, respectively, “the universe” and “create”.

Finale: The world as I know it is not a mystical concept but contains a welcome admission that I do not, and cannot, know everything. Furthermore I cannot know all things equally well, but I can be incredibly ignorant about many things. It is a sad fact that the expression “abysmal lack of knowledge” has meaning and much application.

Factoids – Old and New

This is part of a series called “Questions of Facts”.

The term *factoid* was coined by Norman Mailer to express the idea that many things we believe to be true — and which normally go unchallenged — are products of the public media (radio, TV, newspapers, advertisers). Factoids are therefore statements made by the media and are about states of affairs. The statements are supposedly correct, or largely true. The media who circulate these may have other motives than keeping the public well-informed. According to Norman Mailer, they often (too often) pervert truth and do so for entirely self-serving reasons.

The beneficiary of these so-called news items is usually someone who is selling a product or a service which supposedly “deals” with the problem identified by the factoid. Example: a news item appears which spreads a rumor that our waters are being poisoned and are therefore unsafe to drink. This message creates a demand for bottled pure water! The proposed remedy hardly deals with the problem of how a healthy fresh water supply can be ensured.

Such news items are often couched in convincing words, designed to persuade everyone to accept what they hear or read as being true and trustworthy statements. We — the public — are being “had”!

The word *factoid* itself is a fusion of two sources: the word fact, which derives from the Latin participle factum — “made” — and the Greek ending -oid, meaning “to be like.” *Factoid* therefore refers to a statement which is fact-like, but not a fact. It is faulted as a statement of fact, and therefore not true in the sense that facts supposedly are.

Let’s look at the logic of the term *fact*. It is generally assumed that a fact asserts something that is true. But clearly not all statements are truth-carriers. “To be or not to be” is the expression of a quandary. “Two eggs, sunny-side up” is part of a request. “Two plus two makes four” is arithmetic and follows a rule of logic. “I am a man” is a declaration. “I am feeling happy as a lark” is evocative and an analogy.

But “Mary Wollstonecraft Shelley wrote the novel Frankenstein” is a claim to fact — and as far as I know is a justified statement, whereas the claim “Bernard Shaw wrote Lives and Loves of Richard Wagner” is a statement of fact which has not been justified — and is false. He did write The Perfect Wagnerite, which neither deals with Wagner as a person nor is an adoration of the writer/composer, but is an amusing critical appraisal of his music-dramas.

Statements of fact — or factual statements — are always claims made by someone or by a group of persons, and which may result in one of these outcomes:

  1. The statement is justified by the evidence submitted in support of the claim made, or
  2. The claim, as stated, cannot be supported.

But what about claims whose support base has fallen into quick-sand over time — as often happens? Remember the thesis that the earth is stationary and the sun revolves around it, or that light travels through an ether. These two statements are today regarded as “has-been facts.” The evidence for them eroded over time for reasons which can be stated. These represent cases of justified corrections, or, as Popper used to say, of falsifications.

Ordinary statements regarding everyday matters often get corrected in the light of new “revelations”, as when we correct the widely held view that all swans are white. There are today an estimated 500,000 black swans worldwide — and they are not an endangered species.

However, statements of fact are either true or false: if true, we continue to call them facts; if false, we discard them, perhaps changing their label from *true* to *false*, or employing some other strategy to indicate that what was once thought to be a statement of truth is now highly questionable and should be rephrased in a way which preserves its “kernel of truth”, or else declared to be outright “fallen”.

I reserve the term *factoid* for a “fallen fact” or, to be kind, for a “retired fact,” but do not call these instances deliberate lies. Honor the fallen, understand why they have suffered their fate, but don’t throw them dishonored onto a dung heap. One can learn much from mistakes honestly made.