www.TheLogician.net © Avi Sion - all rights reserved
Avi Sion, 1990 (Rev. ed. 1996) All rights reserved.
CHAPTER 46. ADDUCTION.
Induction, in the widest sense, is concerned with finding the probable
implications of theses. Deduction may then be viewed as the ideal or limiting
case of induction, when the probability is maximal or 100%, so that the
conclusion is necessary. In a narrower sense, induction concerns all
probabilities below necessity, when a deductive inference is not feasible.
All this refers to logical probability. A thesis is logically possible if
there is some chance, any chance, of it being found true, rather than false.
'Probability' signifies a more defined possibility, a degree of possibility, as it were.
Thus, we understand that low probability means fewer chances of truth as
against falsehood; high probability signifies greater chances of such outcome;
even probability implies that the chances are equal. High and low probability
are also called probability (in a narrower sense) and improbability (with the
im- prefix suggesting 'not very'), respectively. Necessity and impossibility are
then the utter extremes of probability and improbability, respectively.
There are levels of possibility, delimited by the context, the logical
environment. This can be said even with regard to formal propositions.
Taken by itself, any proposition of (say) the form 'S is P', is possible. But,
for instance, in the given context 'S is M and M is P', that proposition becomes
(relatively) necessary: its level of possibility has been formally raised.
Alternatively, in the given context 'S is M and M is not P', that proposition
becomes (relatively) impossible: its level of possibility has been formally lowered.
The same applies with specific contents. At first sight, every statement
about anything seems logically 'possible'. This just means that the form is
acceptable, in that there exist other contents for it of known value: a well-guarded
stamp of approval.
As we analyze it further, however, we find the statement tending either
toward truth or toward falsehood. We express this judgement by introducing a
modality of probability into the statement. We place the statement in a logical
continuum from nil to total credibility.
In any case, we know from experience that such probabilities are rarely
permanent. They may increase or decrease; they may first rise, then decline,
then rise again. They vary with context changes. Keeping track of these
probabilities is the function of induction. For example, when a contradiction
arises between two or more propositions, they are all put in doubt somewhat, and
their negations are all raised in our esteem to some extent, until we can
pinpoint the fault more precisely.
In the chapter on credibility, we described degrees of credibility as
impressions seemingly immediately apparent in any phenomenon. Thus, credibility
is a point-blank, intuitive notion. In the chapter on logical modality, on the
other hand, we showed that the definitions of unspecific plural modalities
coerced us into the definition of logical probabilities with reference to a
majority or minority of contexts. Thus, knowledge of logical probability
presupposes a certain effort and sophistication of thought, a greater awareness of the plurality of contexts.
Here, we must inquire into the relation between credibility and logical probability.
Every proposition has, ab initio,
some credibility, if only by virtue of our being able to formulate
it with any meaning. This intuitive credibility is undifferentiated, in the
sense that, so long as it is unchallenged, it is virtually, effectively, total.
But at the same time, this credibility is not very informative or decisive,
because the opposite thesis may have been ignored or may be found to have equal credibility.
As we begin to consider the proposition in its immediate context, and we
find contradictions (or even sense some unspecified cause for doubt), the
credibility becomes more comparative, and it is certified or annulled, or seen
as more or less extreme one way or the other, or as problematic (equally credible either way).
As our perspective is broadened, and we project changes in context, the
problematic credibilities become more qualified; that is, they are quantified
by some specific logical probability, so that they shift more decidedly in
either direction. Thus, problemacy (median credibility) may be viewed as the
very minimum, the beginning, of probability.
In this way, all the plural logical modalities may be viewed as
'filtering down' to the single-context level of truth or falsehood. This
transmission of modality, from the high level of many-contexts to the low level
of the present context, may be immediately apparent (as in the case of
necessities and impossibilities), or may gradually develop over time (as with
all contingent probabilities).
As probabilities vary, through new inputs of raw data into the actual
context (so that more alternative contexts are imaginable), and through closer
scrutiny of available data, the credibilities under their influence also vary.
Logical probability, as formally defined, is impossible to know with
finality. The exception is in the extreme cases of logical necessity or
impossibility, which can be known even without access to all conceivable
contexts, through the one-time discovery of self-evidence or self-contradiction
(in paradoxical propositions); these modalities are permanent.
But in all cases of logical probability based on contingency, there is no
way to make a sure statement of the form 'In most contexts, ...'. All we can
refer to are most of the contexts considered
so far; these may in reality be a minority of all possible contexts, for all
we know. Such modal statements are therefore not static, never entirely final.
We have shifted the concept of logical probability from its rigid formal
definition as 'true in most contexts', to a more practical version: 'true in
most known contexts'. It is thus no longer implied to be static; it is
now flexible, and suggests comparison of credibilities with a reasonable
degree of purpose.
Thus, the concepts of (comparative) credibility and logical probability
ultimately blur, and can to some extent be used interchangeably. However, if we
understand logical probability in its strictest sense, as based
on and implying logical possibility, then it should not be confused with
credibility, which is even applicable to logically impossible propositions
(until their self-contradiction is discovered). Here, I use 'probability' in an
indeterminate sense, so as to avoid the issue.
The main purpose of induction is to lead us to facts, to hopefully true
specific contents. How we know their logical probabilities is not a separate or
additional goal for inductive research; it is one and the same issue with that
of knowing their truths. In the process of pursuit of facts, by evaluating our
current distance from the establishment of truth, we
are incidentally also finding their logical probabilities.
Ultimately, we would like to construct a clear, step-by-step, model
of human knowledge, showing precisely how each proposition in it is arrived at;
but in the meantime, the processes involved can be broadly defined. How exactly
do we get to know these logical gradations? They are not arbitrary, not
expressions of subjective preference, not intuitive guesses; there is a system
to such evaluations.
The investigation of this problem in general terms, that is, without
reference to specific forms, may be called 'adduction'.
Adduction provides us with the rules of evidence and counterevidence, which
allow us to weight the varying probabilities of theses.
The more evidence we adduce for our proposed thesis, the more it is confirmed
(strengthened); the more evidence we adduce for a contrary thesis, the more
ours is undermined (weakened). These
valuations should not be confused with proof and refutation, which refer to the
ideal, extreme powers of evidence.
Adduction is performed by means of the logical relations described by
hypothetical and disjunctive propositions. These, we saw, are normally based on
the separate logical possibility of two theses, and inform us about the logical
modalities of their conjunctions, together or with each other's antitheses. They
establish connections of varying degree, direction, and polarity.
Now, 'If P, then Q' represents necessary connection, the highest level;
it could be stated as 'if P, necessarily Q'. Accordingly, 'if P, then nonQ',
incompatibility, could be stated 'if P, impossibly Q'. The contradictories of
these would be 'if P, possibly Q' (= 'if P, not-then nonQ') and 'if P, possibly
not Q' (= 'if P, not-then Q'). We can, following this pattern, think in terms of
probabilities of connection.
Adductive argument evolves out of apodosis. It most typically takes the
following forms:
If P, then Q,
and Q; therefore, probably P;
or: and not P; so, probably not Q.
These conclusions, so far, do not express the precise degree of
probability; they do indicate that the possible result has increased
in probability. The possibility of the result is already implicit in the
major premise to some extent. A deductive, necessary, conclusion would not be
justified. But we are one step ahead, in that it is conceivable that the minor
premise is true because the proposed
conclusion was true.
We argue backwards, from the consequent to the antecedent, or from the
denial of the antecedent to the denial of the consequent. As apodosis, this is
of course invalid; but here we view the minor premise as an index
to, rather than proof of, the conclusion.
The more hypotheses suggest a conclusion, the more probably will it turn
out to be true. The fewer hypotheses suggest a conclusion, the more probably will
it turn out to be false. Thus, 'evidence'
may be defined as whatever increases the logical probability of a thesis by any
amount, and 'counterevidence' refers to sources of decrease.
Through adduction, we mentally shift from incipient credibility and
problemacy, to a more pondered logical probability.
Note that the first mood, the affirmative one, is strictly more correct
than the second, negative, mood. For, in the negative case, we presuppose the
major premise not to be complemented by 'if nonP, then Q', even though the
latter is a formally conceivable adjunct. That is, we are presuming that 'nonQ'
is logically possible, without prior justification, since this is not always
part of the basis of the major premise. Whereas, in the positive case, if 'if
nonP, then Q' were also given, the additional conclusion 'probably not P' would
balance but not strictly contradict 'probably P', and also allow Q to be true either way.
It follows that the conclusion of the negative mood is more precisely,
'if nonQ is at all possible, then it is now more probable'. But since, as
earlier pointed out, every proposition is at first encounter logically possible,
this is not a very significant distinction. The issue of basis is more serious
for natural, temporal or extensional conditionals than for logical conditionals.
We can simply say that if 'nonQ' turns out to be logically impossible for
other reasons, then of course the initial possibility is thenceforth annulled.
Such an eventuality is not excluded by the negative adductive argument, just as
the positive version allows for the eventual denial of P, anyway.
Note then that the loose sense of logical probability here intended does
not imply that 'P is logically possible' (in the first mood) or that 'nonQ is
logically possible' (in the second mood), unless these possibilities were part
of the tacit basis of the major premise. Logical possibility must still be
strictly understood as signifying an established necessity or contingency.
Other moods of adduction follow by changing the polarities of theses.
These represent other valuable approaches to provision of evidence or
counterevidence, confirmation or undermining.
Note that if the major premise is contraposed, the conclusion remains the
same. This shows that the listed moods constitute a consistent system.
We can also form disjunctive adductive arguments, like the following,
with any number of theses:
P or Q (or R, etc.),
and not P; therefore, Q (or R, etc.) is more probable.
It is clear that if the major
and/or minor premise in all these arguments were probabilistic, instead of fully
necessary or factual, some probability would still be transmitted down to the
conclusion, albeit a proportionately more tenuous one.
This principle of 'transmissibility' of credibility, let us call it, is
very important to logic, because it means that, although deductive logic was
designed with absolutely true premises in mind, its results are still applicable
to premises of only relative truth. Thus, deductive processes also have some inductive significance.
We previously made a clear distinction between the 'uppercase' forms of
hypothetical, like 'if P, then nonQ', which involve a logically necessary
connection, and the lowercase forms, like 'if P, not-then Q', which merely
establish a compatibility. This distinction is especially important in deductive
argument, such as apodosis.
We can conceive of less than necessary major premises, having forms like
'if P, possibly or probably Q'. Some probability is still transmitted down to
the conclusion, though of course again much more tentatively and
insignificantly. We can thus regard arguments like the following as also
adductive; in fact, they are the most comprehensive formats of adductive argument.
In such argument, the probabilities involved may have any degree. Also,
the premises may have very different probabilities; and the probability of the
conclusion depends on the overlap, if any, of the conditions for realization of
the premises, so that it is generally far inferior. It is normally very
difficult to quantify such probabilities precisely; but, when we can estimate
the degrees of the premises, we can accordingly calculate the degree of the
conclusion (which may be zero, if there is no overlap).
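This calculation can be illustrated numerically. The model below is my own simplification, not the author's formalism: treat each premise's probability as the measure of the contexts in which it holds, so that the conclusion can only be realized in the overlap of the two premises' conditions, and has zero probability when those conditions are disjoint.

```python
# Hypothetical sketch: premise probabilities as measures of the contexts
# in which each premise holds; the conclusion holds only in the overlap.

def conclusion_probability(p_major: float, p_minor: float, overlap: float) -> float:
    """Probability of the conclusion, where 'overlap' is the measure of
    contexts in which both premises are realized together."""
    if not 0.0 <= overlap <= min(p_major, p_minor):
        raise ValueError("overlap cannot exceed either premise's probability")
    return overlap

print(conclusion_probability(0.8, 0.6, 0.5))  # substantial overlap: 0.5
print(conclusion_probability(0.8, 0.6, 0.0))  # disjoint conditions: 0.0
```

As the text notes, the conclusion's probability is generally far inferior to that of either premise, and vanishes entirely when the premises' conditions never coincide.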
We could thus expand our definitions of apodosis and adduction, so that
they are equivocal. In that case apodosis and adduction (in the narrow senses we
adopted) would respectively be: forward and backward apodosis (in the larger
sense), or necessary/deductive and merely-probable/inductive adduction (in the
larger sense). This is mentioned only to show the continuity of the two processes.
Note that when we formulate hypothetical propositions, we often order the
theses according to their probabilities. 'If P, then Q' may be meant to suggest
implicitly that P is so far more probable than Q, and may be used deductively to
improve the probability of Q; or that Q is so far more probable, and may be used
inductively to raise the probability of P. Tacitly, this signifies an
argument with a necessary major premise, and a probabilistic minor premise and conclusion.
Similarly, by the way, for disjunctive argument. Premises and conclusion
may have any degrees of logical probability. Also, the minor premise may be
implicit in the major, by virtue of our ordering the alternatives, from the most
likely (mentioned first to attract our attention) to the least (relegated to the
periphery of our attention); or from the least likely (because easiest to
eliminate) to the most (the leftover alternative, when we reach the end of the list).
We have thus far described adductive argument, but have not yet validated
it. We have to explain why the probable conclusion is justified, and clarify by
how much the logical probability is increased. The answer to this question is
found in the hidden structure of such argument, the pattern of thought which underlies it.
Let us suppose that P1, P2, ... Pn are the full list of all the
conceivable theses, each of which is separately capable of implying Q, so that
the denial of all of them at once results in denial of Q. This means:
If P1, then Q; and if P2, then Q; etc.
or, more succinctly,
If P1 or P2 or ... or Pn, then Q.
And, since the list is exhaustive,
If not-P1 and not-P2 and ... and not-Pn, then nonQ.
In that ideal situation, we can say that if Q is found true, then each of
P1, P2, ... Pn has prima facie an equal chance of having anteceded that truth. We
know at least one of them must be true (since otherwise Q would be false), but
not precisely which. Each carries an nth part of the total probability which
this necessity embraces. Thus, the degree of probability is in principle
knowable, and the process justifiable.
If one of the alternative antecedents is thereafter found false, the
number of alternatives is decreased, and so the probability of each of the
remainder is proportionately increased. Where only one alternative remains it
becomes maximally probable, that is, necessary; and the conclusion is deductive
rather than adductive or inductive (in the narrow sense).
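The arithmetic of this equal-shares argument can be sketched in a few lines (a simplified model assuming, as the text does, that each conceivable antecedent starts with an equal chance; the function name is illustrative):

```python
# Equal-shares sketch: if Q is known true and P1..Pn are its only
# conceivable antecedents, at least one must be true, and each surviving
# alternative carries an equal fraction of that certainty.

def antecedent_share(n_alternatives: int, eliminated: int = 0) -> float:
    """Probability of each remaining antecedent after 'eliminated' of the
    n alternatives have been found false (Q itself being known true)."""
    remaining = n_alternatives - eliminated
    if remaining <= 0:
        raise ValueError("at least one antecedent must remain while Q is true")
    return 1.0 / remaining

print(antecedent_share(4))     # four alternatives: each has probability 0.25
print(antecedent_share(4, 2))  # two eliminated: each survivor rises to 0.5
print(antecedent_share(4, 3))  # one left: probability 1.0, i.e. necessity
```

The last case mirrors the text: when only one alternative remains, its probability is maximal, and the inference becomes deductive rather than adductive.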
In practice, we do not always know or consider all the alternatives; even
when we think we are aware of them all, it may only be an assumption, a
generalization. Still, the principle remains, even if the degree of probability
we assign to the conclusion turns out to be inexact. This is because we are here
dealing with logical probability, which is intrinsically tentative and open to
change. That is just the function and raison d'être of logical probability,
to monitor the current status of propositions in an evolving body of knowledge.
If, not yet knowing whether Q is true or false, we find one of the
alternatives, say P1, false, we can say that we are one step closer to the
eventuality that all are false, from which the falsehood of Q would follow. In
that case, the probability of Q being false has increased by an increment of one nth part.
If thereafter P2, say, is also found false, the chances of Q being false
are further increased. When all the conditions of that event are fulfilled, the
probability becomes maximal: a necessity.
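The incremental view just described can likewise be sketched (again under the simplifying assumption of equal nth-part shares, which is the text's model rather than a general theorem of probability):

```python
# Equal-increments sketch: Q can only be true through one of P1..Pn, so
# each falsified alternative adds one nth part toward Q's falsehood.

def chance_of_not_q(n_alternatives: int, falsified: int) -> float:
    if not 0 <= falsified <= n_alternatives:
        raise ValueError("falsified count must lie between 0 and n")
    return falsified / n_alternatives

print(chance_of_not_q(5, 0))  # nothing eliminated yet: 0.0
print(chance_of_not_q(5, 3))  # three of five eliminated: 0.6
print(chance_of_not_q(5, 5))  # all eliminated: 1.0, so Q is impossible
```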
In formal terms, what the above means is that 'If P, necessarily Q' is
convertible to 'If Q, (a bit more)
probably P'. Similarly, 'If P, necessarily Q' is invertible to 'If not P, (a
bit more) probably not Q'. Even if we do not know what, and how many, are
the other shareholders of the overall probability, these inferences retain their validity.
In aetiological terms, we thus have two sources of probability increase.
A thesis (here, P1 for instance) may be rendered more probable by the truth of
another (viz., here, Q), of which it is an alternative contingent cause. Or a
thesis (here, nonQ) may be rendered more probable by the truth of another (viz.,
here, not-P1 for instance), which is a component of a necessary cause of it.
Thus, more broadly, probability is transmitted across the logical
relationship signified by hypotheticals: in both directions, from antecedents to
consequents and vice versa, and to varying degrees, reflecting the intensity of the connection.
Each such probability change is relative: it applies within that limited
environment which we projected. In practice, the degree of probability we assign
to a thesis is a complex result of innumerable such incremental changes.
Needless to say, when a thesis is strengthened, its contraries are
proportionately weakened; and vice versa.
A thesis may be increasingly confirmed for a variety of reasons, and at
the same time increasingly undermined for a variety of other reasons. What
matters is its resultant probability, its overall rating, the sum and average of
all the affirming and denying forces impinging upon it, at the present stage of knowledge.
It follows that, though the alternative theses are, to begin with, of
equal weight, they may, in a broader context, be found of unequal weight. In
that case, we select the relatively most weighty, the logically most probable,
as our preferred thesis at any stage of the proceedings.
All the above can be repeated with respect to disjunctions. Consider two
or more theses, each with some degree of credibility from other sources. If they
are found to be contrary, their credibilities are all proportionately lowered,
since we know they cannot all be true. If they are found to be subcontrary,
their credibilities are all proportionately raised, since we know they cannot
all be false. However, in the case of exact contradictories, their independent
credibilities are unaffected, since their mutual exclusion and exhaustiveness
offset each other.
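One crude way to picture these adjustments is a proportional-scaling model (my own device, chosen only to mirror the text's "proportionately lowered" and "proportionately raised"; it is not the author's formula):

```python
# Contrary theses cannot all be true, so their credibilities may not sum
# above 1: if they do, scale them all down proportionately.
def adjust_contraries(creds):
    total = sum(creds)
    return [c / total for c in creds] if total > 1 else list(creds)

# Subcontrary theses cannot all be false, so their falsehood-credibilities
# may not sum above 1: scaling those down raises the credibilities.
def adjust_subcontraries(creds):
    falsehoods = [1 - c for c in creds]
    total = sum(falsehoods)
    if total > 1:
        falsehoods = [f / total for f in falsehoods]
    return [1 - f for f in falsehoods]

print(adjust_contraries([0.8, 0.6]))     # both lowered, now summing to 1
print(adjust_subcontraries([0.2, 0.3]))  # both raised
print(adjust_contraries([0.5, 0.5]))     # exact contradictories: unchanged
```

The third call illustrates the offsetting noted in the text: contradictories whose credibilities already exhaust the total are left untouched.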
Lastly, note that we have to clearly discriminate between: exhausting the
known possibilities, on the one hand, and open-mindedness to the eventual
possibility that new alternatives be found one day, on the other hand.
At any given stage in the development of knowledge we have to bow to all
the apparent finalities; this does not prevent us from accepting the principle
that some correction might later be called upon. On the other hand, that
attitude of receptiveness to change should not be allowed to belittle our trust
in acquired certainties.
When all but one of the known theories concerning some phenomena have
been eliminated, or one theory is shown to be their only conceivable
explanation, we must accept our conclusion as final and unassailable, provided
no inconsistency or specific cause for doubt remains. The truth that some such
certainties have in the past been overturned does not logically imply that this
particular certainty will ever be overturned.
There is a formal difference between the status of logical possibility
within a context, and the general admission that context does change, which
stands outside of any context. They are not identical in power: the former
affects contextual reasoning, the latter plays no active part in deliberations,
being only an open-ended philosophical truth without specific applicability.
We ordinarily think assertorically, in terms of statements like 'if P,
then Q', meaning 'if P is established, then Q may be claimed to be known'. But
sometimes we remain dubious, and say 'if perhaps P, then perhaps Q'. Some people
reason in this manner more often than others, hanging on to uncertainties so
insistently that they inhibit the forward motion of their knowledge.
But such reasoning, which may be called 'problematic logic', is
essentially no different from assertoric logic. Its inferences are exactly
parallel, the only difference is the explicit emphasis it puts on the
probabilities of the theses.
Perhaps the legitimate context for such statements would be whenever we
inquire into eventual developments of knowledge. Right now, say, P is to all
appearances true; but there is always an off-chance that it might turn out not
to be true, after all; in that case, we ask, what
would happen if P was not true. We look ahead, even though we are without
strict justification, in order to be
prepared for eventual alternatives to 'established fact'.
As we saw in the discussion of de-re
conditioning, adduction is also feasible using natural, temporal or
extensional conditionals, but it must be stressed that the emergent probability
is essentially in logical modality. We might call it para-logical probability,
meaning not purely logical, if we wish to underline the faint difference, which
relates to source of judgment.
A categorical proposition always has adductive implications. 'Most (or
Few) S are P' is taken to imply 'This S is probably (or improbably) P'; that is,
for any random S, the logical probability is high (or low) that it will be P, in
proportion to the quantity. We consider the likelihood that the given case of S
happens to be one of those which are P.
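This frequency-to-single-case step is simple proportion; a minimal sketch (the counts and function name are illustrative):

```python
# 'Most (or Few) S are P' -> 'This S is probably (or improbably) P':
# the probability for a random S equals the proportion of S that are P.

def single_case_probability(s_that_are_p: int, total_s: int) -> float:
    if total_s <= 0 or not 0 <= s_that_are_p <= total_s:
        raise ValueError("counts must describe a non-empty class")
    return s_that_are_p / total_s

print(single_case_probability(90, 100))  # 'Most S are P': probability 0.9
print(single_case_probability(5, 100))   # 'Few S are P': probability 0.05
```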
Likewise, 'This S is P in most (or few) circumstances' implies 'This S is
probably (or improbably) P': that is, for any randomly chosen circumstance, there is
a logical probability that this S will be P in it, commensurate with the number
of natural circumstances favoring such event. We consider the likelihood that
the given circumstance surrounding this S happens to be one of those in which
this S is P. Similarly with temporal modality.
When two or more of the extensional, and natural or temporal, modalities
are involved in a proposition, the logical effect is compounded. The logical
probability is increased (or decreased) to some extent by each of the de-re
modalities, and the resultant is whatever it happens to be.
Such transmission of logical probability, from a plural de-re proposition down to a
single-unit case for the type of modality concerned, on the ground of a
majority or minority of instances, circumstances or times, is also to be
found with conditionals. The following are some typifying examples:
In extensional adduction:
Any S which is P, is Q,
and this S is Q; therefore, this S is probably P;
or: and this S is not P; so, this S is probably not Q.
In natural adduction:
When this S is P, it must be Q,
and this S is Q; therefore, it is probably P;
or: and this S is not P; so, it is probably not Q.
In temporal adduction:
When this S is P, it is always Q,
and this S is Q; therefore, it is probably P;
or: and this S is not P; so, it is probably not Q.
These concepts can be further broadened by reference to majoritive or
minoritive conditionals, in arguments like the de-re
adductions here shown, and likewise for corresponding apodoses. Some logical
probability is still transmitted down from premises to conclusion.
Thus, if the major premises in such arguments had been the extensional
'Most (or few) S which are P, are Q', or the natural 'When this S is P, it is in
most (or few) circumstances Q', or the equivalent temporal conditional, the
conclusion would still have some degree of logical probability, in proportion
to the numbers of instances, circumstances or times involved. Likewise, in cases
of compound modal type.
If the minor premises were respectively of the form 'Most S are Q' (or
'Most S aren't P'), or 'This S is in most circumstances Q' (or 'This S is in
most circumstances not P'), or the equivalent temporal categorical, a
probable conclusion can likewise be drawn. Note, however, that if the minor
premise is of low de-re probability, it does not follow that the conclusion is
likewise of low probability; all we can say is that the conclusion has very
slightly increased in probability. Likewise, in cases of compound modal type.
A probabilistic major premise, of any modal type or combination of modal
types, together with a probabilistic minor premise, of any modal type or
combination of modal types, yield a conclusion of some, though much diminished,
degree of logical probability.
More broadly still, such conditional major premises, and indeed the minor
premises, may have varying degrees of purely logical probabilities as
propositions in a knowledge context, quite apart from the inherent
'para-logical' (de-re) probabilities
just discussed. In that case, the resultant logical probability is still further diminished.
We can similarly adduce evidence through de-re
disjunctive adduction, in each or any combination of these types of modality.