We speak as if the phrase “the facts” or “the totality of facts” suggests that there is (exists) such an entity as a *totality of facts*, on the analogy of “all the king’s men” as opposed to merely “some”. There is no evidence to support this assumption. Once upon a time there were registers of the men who served in the King’s army, but there has been nothing comparable for the kingdom of facts. Yet presumably there was once such a number of facts in a specific country during a definable period, during, say, the reign of James II or of Queen Victoria!

We should therefore think of a *domain* — rather than a collection — of facts, namely all those matters which could be assembled to reflect a present (current, or time-constrained) set of certainties, of strongly confirmed knowledge-claims which we could defend both logically and methodologically at a specific time. There are probably fewer of these than was once thought!

The expression “totality of facts” therefore refers to a category with a limited membership, in contrast to a category which is explicitly stated to be *without* limits, i.e. limitless. My reference is to “a flock of sheep in the field”: the field itself is bounded, so it follows that the sheep in question are only those within that field. To talk of “countless sheep grazing in this field” would be an exaggeration, since under suitable conditions one could count them; at the end of the count there would be a *finite* number. The phrase “a countless number” properly means that the number *cannot* be counted! But we can also miscount: to over-count would be classed as an error of calculation. Errors don’t count.

A category which has no limited membership — and therefore is without restrictions — is like the category of “children yet unborn”. The phrase suggests that there are members which make up this category, which belong to it, yet clearly any number offered for it is only an *estimate*. Whatever number is submitted is therefore only *conjectured* and is not based on an actual count of instances — which is what was wanted! In short, there are things which can be counted up to an *agreed (finite) limit*, e.g. the current number of toothless men in Uruguay. But there are also entities (open categories) which defy such treatment, for example the number of adults in Brazil who *will* die of apoplexy — which can only be an *estimate*.

What should we do with the widely used expression “the totality of facts”? Should we agree to banish it from ordinary use on the grounds that it inevitably misleads, or that there are too many cases in which an arithmetic total cannot be obtained? We could substitute something which preserves the *flavor* of the expression, for example, “The sum of the evidence suggests…”, “In general the majority of cases indicate that…”, or “It seems highly likely that…”; that is, we could change a categorical statement into a probabilistic one.

It seems, therefore, that to refer to “totals” and to “totalities” is often quite legitimate, but — not surprisingly — *only* in special cases whose character will need to be defined. In short, it is up to us to use these expressions in a manner which eliminates unnecessary arguments.

This is a wonderful avenue of exploration. If I may, I’d like to add another perspective to what you say: the idea that facts can fail, be reinterpreted, or be refined. The last of these, refinement, is the one that interests me. For example, suppose we know the speed of light to 10 decimal places. That speed is a fact which is important in the construction of theories such as special and general relativity. What happens, then, when the estimate of the speed of light becomes known to 11 decimal places? The refinement may leave the underlying theory the same, although we now have greater accuracy; but it can (and often does) force us to alter the theory in some way. There are uncountably many (i.e., a second level of infinity) such refinements possible for this one universal constant. Given that most of these would not be consistent with extant theory, this implies an uncountable number of possible theory refinements, and hence a lack of totality.
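A sketch of the cardinality claim (my gloss, not part of the original argument): at any *finite* number of extra decimal places the candidate refinements are finite in number, but an unending refinement fixes an infinite digit sequence, and the set of such sequences is uncountable:

```latex
% Candidate refinements at k extra decimal places: finite.
N_k = 10^{k}
% An unending refinement fixes an infinite digit sequence; the set of
% all such sequences has cardinality
\bigl| \{0,1,\dots,9\}^{\mathbb{N}} \bigr| = 10^{\aleph_0} = 2^{\aleph_0},
% which is uncountable by Cantor's diagonal argument.
```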

Let’s consider this point with respect to the Popperian proposal that falsifiability is the criterion of a genuinely scientific theory. Here’s the thing: the chance of the theory predicting the 11th decimal place by luck is 10%, so the theory is falsifiable and, if the prediction is borne out, it has overcome a 1-in-10 chance of success. But the chance of any theory accurately predicting an infinity of these refinements (there are infinitely many decimal places!) is precisely, mathematically, 0. In other words, every theory is logically false if it relies on facts — certainly if the facts involve numbers from the real number line, although I imagine this argument could be extended in many other ways. What’s more, we can never test a theory to the infinite accuracy required.
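The arithmetic behind this can be sketched in a few lines (a hedged illustration, not part of the original; it assumes each further digit is an independent, uniform 1-in-10 guess):

```python
# Chance of a theory "guessing" n successive decimal digits of a
# constant, assuming each digit is an independent 1-in-10 chance.
def match_probability(n_digits: int) -> float:
    """Probability of matching n successive decimal digits by luck."""
    return 0.1 ** n_digits

print(match_probability(1))   # 0.1 -- the 1-in-10 chance in the text
print(match_probability(11))  # vanishingly small already
# As n grows without bound the product tends to 0, which is the sense
# in which no theory can pin down an infinite expansion by chance.
```

The point of the sketch is only that the probabilities shrink geometrically, so their limit over infinitely many digits is exactly zero.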

Of course, a theory is just a model of reality, not reality itself. In other words, it is always a proposal and should be taken as such. As this simple thought experiment has shown, it can logically never be more than that and, therefore, can never capture the totality of facts.