A System of Logic, Ratiocinative and Inductive/Chapter 33


CHAPTER XVIII.

Of The Calculation Of Chances.

§ 1. "Probability," says Laplace,[1] "has reference partly to our ignorance, partly to our knowledge. We know that among three or more events, one, and only one, must happen; but there is nothing leading us to believe that any one of them will happen rather than the others. In this state of indecision, it is impossible for us to pronounce with certainty on their occurrence. It is, however, probable that any one of these events, selected at pleasure, will not take place; because we perceive several cases, all equally possible, which exclude its occurrence, and only one which favors it.

"The theory of chances consists in reducing all events of the same kind to a certain number of cases equally possible, that is, such that we are equally undecided as to their existence; and in determining the number of these cases which are favorable to the event of which the probability is sought. The ratio of that number to the number of all the possible cases is the measure of the probability; which is thus a fraction, having for its numerator the number of cases favorable to the event, and for its denominator the number of all the cases which are possible."

To a calculation of chances, then, according to Laplace, two things are necessary; we must know that of several events some one will certainly happen, and no more than one; and we must not know, nor have any reason to expect, that it will be one of these events rather than another. It has been contended that these are not the only requisites, and that Laplace has overlooked, in the general theoretical statement, a necessary part of the foundation of the doctrine of chances. To be able (it has been said) to pronounce two events equally probable, it is not enough that we should know that one or the other must happen, and should have no grounds for conjecturing which. Experience must have shown that the two events are of equally frequent occurrence. Why, in tossing up a half-penny, do we reckon it equally probable that we shall throw cross or pile? Because we know that in any great number of throws, cross and pile are thrown about equally often; and that the more throws we make, the more nearly the equality is perfect. We may know this if we please by actual experiment, or by the daily experience which life affords of events of the same general character, or, deductively, from the effect of mechanical laws on a symmetrical body acted upon by forces varying indefinitely in quantity and direction. We may know it, in short, either by specific experience, or on the evidence of our general knowledge of nature. But, in one way or the other, we must know it, to justify us in calling the two events equally probable; and if we knew it not, we should proceed as much at hap-hazard in staking equal sums on the result, as in laying odds.

This view of the subject was taken in the first edition of the present work; but I have since become convinced that the theory of chances, as conceived by Laplace and by mathematicians generally, has not the fundamental fallacy which I had ascribed to it.

We must remember that the probability of an event is not a quality of the event itself, but a mere name for the degree of ground which we, or some one else, have for expecting it. The probability of an event to one person is a different thing from the probability of the same event to another, or to the same person after he has acquired additional evidence. The probability to me, that an individual of whom I know nothing but his name will die within the year, is totally altered by my being told the next minute that he is in the last stage of a consumption. Yet this makes no difference in the event itself, nor in any of the causes on which it depends. Every event is in itself certain, not probable; if we knew all, we should either know positively that it will happen, or positively that it will not. But its probability to us means the degree of expectation of its occurrence, which we are warranted in entertaining by our present evidence.

Bearing this in mind, I think it must be admitted, that even when we have no knowledge whatever to guide our expectations, except the knowledge that what happens must be some one of a certain number of possibilities, we may still reasonably judge, that one supposition is more probable to us than another supposition; and if we have any interest at stake, we shall best provide for it by acting conformably to that judgment.

§ 2. Suppose that we are required to take a ball from a box, of which we only know that it contains balls both black and white, and none of any other color. We know that the ball we select will be either a black or a white ball; but we have no ground for expecting black rather than white, or white rather than black. In that case, if we are obliged to make a choice, and to stake something on one or the other supposition, it will, as a question of prudence, be perfectly indifferent which; and we shall act precisely as we should have acted if we had known beforehand that the box contained an equal number of black and white balls. But though our conduct would be the same, it would not be founded on any surmise that the balls were in fact thus equally divided; for we might, on the contrary, know by authentic information that the box contained ninety-nine balls of one color, and only one of the other; still, if we are not told which color has only one, and which has ninety-nine, the drawing of a white and of a black ball will be equally probable to us. We shall have no reason for staking any thing on the one event rather than on the other; the option between the two will be a matter of indifference; in other words, it will be an even chance.

But let it now be supposed that instead of two there are three colors—white, black, and red; and that we are entirely ignorant of the proportion in which they are mingled. We should then have no reason for expecting one more than another, and if obliged to bet, should venture our stake on red, white, or black with equal indifference. But should we be indifferent whether we betted for or against some one color, as, for instance, white? Surely not. From the very fact that black and red are each of them separately equally probable to us with white, the two together must be twice as probable. We should in this case expect not-white rather than white, and so much rather that we would lay two to one upon it. It is true, there might, for aught we knew, be more white balls than black and red together; and if so, our bet would, if we knew more, be seen to be a disadvantageous one. But so also, for aught we knew, might there be more red balls than black and white, or more black balls than white and red, and in such cases the effect of additional knowledge would be to prove to us that our bet was more advantageous than we had supposed it to be. There is in the existing state of our knowledge a rational probability of two to one against white; a probability fit to be made a basis of conduct. No reasonable person would lay an even wager in favor of white against black and red; though against black alone or red alone he might do so without imprudence.
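The arithmetic of the two-to-one conclusion can be shown in a short sketch. The equal assignment of one-third to each color is the assumption of the argument (a statement of our ignorance), not a fact about the contents of the box:

```python
from fractions import Fraction

# Three mutually exclusive colors, with no ground for preferring any one:
# assign each the same probability.
colours = ["white", "black", "red"]
p_each = Fraction(1, len(colours))          # 1/3 to each
p_white = p_each                            # 1/3
p_not_white = 1 - p_white                   # 2/3
odds_against_white = p_not_white / p_white  # 2, i.e. two to one against white
print(p_white, p_not_white, odds_against_white)
```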

The common theory, therefore, of the calculation of chances, appears to be tenable. Even when we know nothing except the number of the possible and mutually excluding contingencies, and are entirely ignorant of their comparative frequency, we may have grounds, and grounds numerically appreciable, for acting on one supposition rather than on another; and this is the meaning of Probability.

§ 3. The principle, however, on which the reasoning proceeds, is sufficiently evident. It is the obvious one that when the cases which exist are shared among several kinds, it is impossible that each of those kinds should be a majority of the whole: on the contrary, there must be a majority against each kind, except one at most; and if any kind has more than its share in proportion to the total number, the others collectively must have less. Granting this axiom, and assuming that we have no ground for selecting any one kind as more likely than the rest to surpass the average proportion, it follows that we can not rationally presume this of any, which we should do if we were to bet in favor of it, receiving less odds than in the ratio of the number of the other kinds. Even, therefore, in this extreme case of the calculation of probabilities, which does not rest on special experience at all, the logical ground of the process is our knowledge—such knowledge as we then have—of the laws governing the frequency of occurrence of the different cases; but in this case the knowledge is limited to that which, being universal and axiomatic, does not require reference to specific experience, or to any considerations arising out of the special nature of the problem under discussion.

Except, however, in such cases as games of chance, where the very purpose in view requires ignorance instead of knowledge, I can conceive no case in which we ought to be satisfied with such an estimate of chances as this—an estimate founded on the absolute minimum of knowledge respecting the subject. It is plain that, in the case of the colored balls, a very slight ground of surmise that the white balls were really more numerous than either of the other colors, would suffice to vitiate the whole of the calculations made in our previous state of indifference. It would place us in that position of more advanced knowledge, in which the probabilities, to us, would be different from what they were before; and in estimating these new probabilities we should have to proceed on a totally different set of data, furnished no longer by mere counting of possible suppositions, but by specific knowledge of facts. Such data it should always be our endeavor to obtain; and in all inquiries, unless on subjects equally beyond the range of our means of knowledge and our practical uses, they may be obtained, if not good, at least better than none at all.[2]

It is obvious, too, that even when the probabilities are derived from observation and experiment, a very slight improvement in the data, by better observations, or by taking into fuller consideration the special circumstances of the case, is of more use than the most elaborate application of the calculus to probabilities founded on the data in their previous state of inferiority. The neglect of this obvious reflection has given rise to misapplications of the calculus of probabilities which have made it the real opprobrium of mathematics. It is sufficient to refer to the applications made of it to the credibility of witnesses, and to the correctness of the verdicts of juries. In regard to the first, common sense would dictate that it is impossible to strike a general average of the veracity and other qualifications for true testimony of mankind, or of any class of them; and even if it were possible, the employment of it for such a purpose implies a misapprehension of the use of averages, which serve, indeed, to protect those whose interest is at stake, against mistaking the general result of large masses of instances, but are of extremely small value as grounds of expectation in any one individual instance, unless the case be one of those in which the great majority of individual instances do not differ much from the average. In the case of a witness, persons of common sense would draw their conclusions from the degree of consistency of his statements, his conduct under cross-examination, and the relation of the case itself to his interests, his partialities, and his mental capacity, instead of applying so rude a standard (even if it were capable of being verified) as the ratio between the number of true and the number of erroneous statements which he may be supposed to make in the course of his life.

Again, on the subject of juries or other tribunals, some mathematicians have set out from the proposition that the judgment of any one judge or juryman is, at least in some small degree, more likely to be right than wrong, and have concluded that the chance of a number of persons concurring in a wrong verdict is diminished the more the number is increased; so that if the judges are only made sufficiently numerous, the correctness of the judgment may be reduced almost to certainty. I say nothing of the disregard shown to the effect produced on the moral position of the judges by multiplying their numbers, the virtual destruction of their individual responsibility, and weakening of the application of their minds to the subject. I remark only the fallacy of reasoning from a wide average to cases necessarily differing greatly from any average. It may be true that, taking all causes one with another, the opinion of any one of the judges would be oftener right than wrong; but the argument forgets that in all but the more simple cases, in all cases in which it is really of much consequence what the tribunal is, the proposition might probably be reversed; besides which, the cause of error, whether arising from the intricacy of the case or from some common prejudice or mental infirmity, if it acted upon one judge, would be extremely likely to affect all the others in the same manner, or at least a majority, and thus render a wrong instead of a right decision more probable the more the number was increased.

These are but samples of the errors frequently committed by men who, having made themselves familiar with the difficult formulæ which algebra affords for the estimation of chances under suppositions of a complex character, like better to employ those formulæ in computing what are the probabilities to a person half informed about a case than to look out for means of being better informed. Before applying the doctrine of chances to any scientific purpose, the foundation must be laid for an evaluation of the chances, by possessing ourselves of the utmost attainable amount of positive knowledge. The knowledge required is that of the comparative frequency with which the different events in fact occur. For the purposes, therefore, of the present work, it is allowable to suppose that conclusions respecting the probability of a fact of a particular kind rest on our knowledge of the proportion between the cases in which facts of that kind occur, and those in which they do not occur; this knowledge being either derived from specific experiment, or deduced from our knowledge of the causes in operation which tend to produce, compared with those which tend to prevent, the fact in question.

Such calculation of chances is grounded on an induction; and to render the calculation legitimate, the induction must be a valid one. It is not less an induction, though it does not prove that the event occurs in all cases of a given description, but only that out of a given number of such cases it occurs in about so many. The fraction which mathematicians use to designate the probability of an event is the ratio of these two numbers; the ascertained proportion between the number of cases in which the event occurs and the sum of all the cases, those in which it occurs and in which it does not occur, taken together. In playing at cross and pile, the description of cases concerned are throws, and the probability of cross is one-half, because if we throw often enough cross is thrown about once in every two throws. In the cast of a die, the probability of ace is one-sixth; not simply because there are six possible throws, of which ace is one, and because we do not know any reason why one should turn up rather than another—though I have admitted the validity of this ground in default of a better—but because we do actually know, either by reasoning or by experience, that in a hundred or a million of throws ace is thrown in about one-sixth of that number, or once in six times.
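The frequency reading of the die example can be illustrated by a simple simulation: in a large number of throws of a fair die, the proportion showing ace settles near one-sixth. This is only an illustrative sketch; the trial count and seed are arbitrary:

```python
import random

def observed_ace_frequency(trials=600_000, seed=0):
    """Proportion of throws of a simulated fair die that show an ace."""
    rng = random.Random(seed)
    aces = sum(1 for _ in range(trials) if rng.randint(1, 6) == 1)
    return aces / trials

print(observed_ace_frequency())  # close to 1/6, about 0.1667, for a large number of throws
```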

§ 4. I say, "either by reasoning or by experience," meaning specific experience. But in estimating probabilities, it is not a matter of indifference from which of these two sources we derive our assurance. The probability of events, as calculated from their mere frequency in past experience, affords a less secure basis for practical guidance than their probability as deduced from an equally accurate knowledge of the frequency of occurrence of their causes.

The generalization that an event occurs in ten out of every hundred cases of a given description, is as real an induction as if the generalization were that it occurs in all cases. But when we arrive at the conclusion by merely counting instances in actual experience, and comparing the number of cases in which A has been present with the number in which it has been absent, the evidence is only that of the Method of Agreement, and the conclusion amounts only to an empirical law. We can make a step beyond this when we can ascend to the causes on which the occurrence of A or its non-occurrence will depend, and form an estimate of the comparative frequency of the causes favorable and of those unfavorable to the occurrence. These are data of a higher order, by which the empirical law derived from a mere numerical comparison of affirmative and negative instances will be either corrected or confirmed, and in either case we shall obtain a more correct measure of probability than is given by that numerical comparison. It has been well remarked that in the kind of examples by which the doctrine of chances is usually illustrated, that of balls in a box, the estimate of probabilities is supported by reasons of causation, stronger than specific experience. "What is the reason that in a box where there are nine black balls and one white, we expect to draw a black ball nine times as much (in other words, nine times as often, frequency being the gauge of intensity in expectation) as a white? Obviously because the local conditions are nine times as favorable; because the hand may alight in nine places and get a black ball, while it can only alight in one place and find a white ball; just for the same reason that we do not expect to succeed in finding a friend in a crowd, the conditions in order that we and he should come together being many and difficult. This of course would not hold to the same extent were the white balls of smaller size than the black, neither would the probability remain the same; the larger ball would be much more likely to meet the hand."[3]

It is, in fact, evident that when once causation is admitted as a universal law, our expectation of events can only be rationally grounded on that law. To a person who recognizes that every event depends on causes, a thing's having happened once is a reason for expecting it to happen again, only because proving that there exists, or is liable to exist, a cause adequate to produce it.[4] The frequency of the particular event, apart from all surmise respecting its cause, can give rise to no other induction than that per enumerationem simplicem; and the precarious inferences derived from this are superseded, and disappear from the field as soon as the principle of causation makes its appearance there.

Notwithstanding, however, the abstract superiority of an estimate of probability grounded on causes, it is a fact that in almost all cases in which chances admit of estimation sufficiently precise to render their numerical appreciation of any practical value, the numerical data are not drawn from knowledge of the causes, but from experience of the events themselves. The probabilities of life at different ages or in different climates; the probabilities of recovery from a particular disease; the chances of the birth of male or female offspring; the chances of the destruction of houses or other property by fire; the chances of the loss of a ship in a particular voyage, are deduced from bills of mortality, returns from hospitals, registers of births, of shipwrecks, etc., that is, from the observed frequency not of the causes, but of the effects. The reason is, that in all these classes of facts the causes are either not amenable to direct observation at all, or not with the requisite precision, and we have no means of judging of their frequency except from the empirical law afforded by the frequency of the effects. The inference does not the less depend on causation alone. We reason from an effect to a similar effect by passing through the cause. If the actuary of an insurance office infers from his tables that among a hundred persons now living of a particular age, five on the average will attain the age of seventy, his inference is legitimate, not for the simple reason that this is the proportion who have lived till seventy in times past, but because the fact of their having so lived shows that this is the proportion existing, at that place and time, between the causes which prolong life to the age of seventy and those tending to bring it to an earlier close.[5]
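The actuary's inference is, numerically, nothing more than reading an observed proportion as a probability and applying it to a new group assumed to live under the same collocation of causes. A minimal sketch, using the figures from the text (the group size below is hypothetical):

```python
# Of 100 persons of the given age in the mortality table, 5 on the average
# attain the age of seventy.
observed_survivors = 5
observed_total = 100
p_reach_seventy = observed_survivors / observed_total  # 0.05

# Expected number among a hypothetical group of the same age, assuming the
# same causes remain in operation.
group_size = 240
print(p_reach_seventy * group_size)  # 12.0
```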

§ 5. From the preceding principles it is easy to deduce the demonstration of that theorem of the doctrine of probabilities which is the foundation of its application to inquiries for ascertaining the occurrence of a given event, or the reality of an individual fact. The signs or evidences by which a fact is usually proved are some of its consequences; and the inquiry hinges upon determining what cause is most likely to have produced a given effect. The theorem applicable to such investigations is the Sixth Principle in Laplace's "Essai Philosophique sur les Probabilités," which is described by him as the "fundamental principle of that branch of the Analysis of Chances which consists in ascending from events to their causes."[6]

Given an effect to be accounted for, and there being several causes which might have produced it, but of the presence of which in the particular case nothing is known; the probability that the effect was produced by any one of these causes is as the antecedent probability of the cause, multiplied by the probability that the cause, if it existed, would have produced the given effect.

Let M be the effect, and A, B, two causes, by either of which it might have been produced. To find the probability that it was produced by the one and not by the other, ascertain which of the two is most likely to have existed, and which of them, if it did exist, was most likely to produce the effect M: the probability sought is a compound of these two probabilities.
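Restated in modern terms, the principle makes the probability of each cause proportional to the product of its antecedent probability and the probability that it would, if present, produce the effect. A minimal Python sketch, with hypothetical numbers (only the ratios of the antecedent figures matter, since the result is normalized):

```python
from fractions import Fraction

def probability_of_causes(antecedent, likelihood):
    """Probability that the observed effect was produced by each cause,
    assuming it was certainly produced by one of the causes listed."""
    weights = {c: antecedent[c] * likelihood[c] for c in antecedent}
    total = sum(weights.values())
    return {c: w / total for c, w in weights.items()}

# Hypothetical figures: cause A three times as frequent as B, but B twice as
# likely as A to produce the effect when present.
print(probability_of_causes(
    antecedent={"A": Fraction(3), "B": Fraction(1)},
    likelihood={"A": Fraction(1, 4), "B": Fraction(1, 2)},
))  # {'A': Fraction(3, 5), 'B': Fraction(2, 5)}
```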

Case I. Let the causes be both alike in the second respect: either A or B, when it exists, being supposed equally likely (or equally certain) to produce M; but let A be in itself twice as likely as B to exist, that is, twice as frequent a phenomenon. Then it is twice as likely to have existed in this case, and to have been the cause which produced M.

For, since A exists in nature twice as often as B, in any 300 cases in which one or other existed, A has existed 200 times and B 100. But either A or B must have existed wherever M is produced; therefore, in 300 times that M is produced, A was the producing cause 200 times, B only 100, that is, in the ratio of 2 to 1. Thus, then, if the causes are alike in their capacity of producing the effect, the probability as to which actually produced it is in the ratio of their antecedent probabilities.

Case II. Reversing the last hypothesis, let us suppose that the causes are equally frequent, equally likely to have existed, but not equally likely, if they did exist, to produce M; that in three times in which A occurs, it produces that effect twice, while B, in three times, produces it only once. Since the two causes are equally frequent in their occurrence; in every six times that either one or the other exists, A exists three times and B three times. A, of its three times, produces M in two; B, of its three times, produces M in one. Thus, in the whole six times, M is only produced thrice; but of that thrice it is produced twice by A, once only by B. Consequently, when the antecedent probabilities of the causes are equal, the chances that the effect was produced by them are in the ratio of the probabilities that if they did exist they would produce the effect.

Case III. The third case, that in which the causes are unlike in both respects, is solved by what has preceded. For, when a quantity depends on two other quantities, in such a manner that while either of them remains constant it is proportional to the other, it must necessarily be proportional to the product of the two quantities, the product being the only function of the two which obeys that law of variation. Therefore, the probability that M was produced by either cause, is as the antecedent probability of the cause, multiplied by the probability that if it existed it would produce M. Which was to be demonstrated.

Or we may prove the third case as we proved the first and second. Let A be twice as frequent as B, and let them also be unequally likely, when they exist, to produce M; let A produce it twice in four times, B thrice in four times. The antecedent probability of A is to that of B as 2 to 1; the probabilities of their producing M are as 2 to 3; the product of these ratios is the ratio of 4 to 3; and this will be the ratio of the probabilities that A or B was the producing cause in the given instance. For, since A is twice as frequent as B, out of twelve cases in which one or other exists, A exists in 8 and B in 4. But of its eight cases, A, by the supposition, produces M in only 4, while B of its four cases produces M in 3. M, therefore, is only produced at all in seven of the twelve cases; but in four of these it is produced by A, in three by B; hence the probabilities of its being produced by A and by B are as 4 to 3, and are expressed by the fractions 4/7 and 3/7. Which was to be demonstrated.
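The counting argument for Case III can be checked by direct enumeration; the sketch below reproduces the figures of the text (the variable names are illustrative):

```python
from fractions import Fraction

# Case III: out of 12 cases, A (twice as frequent) exists in 8 and B in 4.
# A produces M in 2 of every 4 of its cases, B in 3 of every 4 of its cases.
cases_with_A, cases_with_B = 8, 4
m_from_A = cases_with_A * Fraction(2, 4)   # 4 cases in which A produced M
m_from_B = cases_with_B * Fraction(3, 4)   # 3 cases in which B produced M
total_m = m_from_A + m_from_B              # M occurs in 7 of the 12 cases

print(m_from_A / total_m, m_from_B / total_m)  # 4/7 and 3/7
```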

§ 6. It remains to examine the bearing of the doctrine of chances on the peculiar problem which occupied us in the preceding chapter, namely, how to distinguish coincidences which are casual from those which are the result of law; from those in which the facts which accompany or follow one another are somehow connected through causation.

The doctrine of chances affords means by which, if we knew the average number of coincidences to be looked for between two phenomena connected only casually, we could determine how often any given deviation from that average will occur by chance. If the probability of any casual coincidence, considered in itself, be 1/m, the probability that the same coincidence will be repeated n times in succession is 1/m^n. For example, in one throw of a die the probability of ace being 1/6; the probability of throwing ace twice in succession will be 1 divided by the square of 6, or 1/36. For ace is thrown at the first throw once in six, or six in thirty-six times, and of those six, the die being cast again, ace will be thrown but once; being altogether once in thirty-six times. The chance of the same cast three times successively is, by a similar reasoning, 1/6^3, or 1/216; that is, the event will happen, on a large average, only once in two hundred and sixteen throws.
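A short sketch of the rule just stated, using the die figures from the text:

```python
from fractions import Fraction

def repeated_coincidence(single_probability, n):
    """Probability that a coincidence of the given probability recurs n times running."""
    return single_probability ** n

p_ace = Fraction(1, 6)
print(repeated_coincidence(p_ace, 2))  # 1/36
print(repeated_coincidence(p_ace, 3))  # 1/216
```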

We have thus a rule by which to estimate the probability that any given series of coincidences arises from chance, provided we can measure correctly the probability of a single coincidence. If we can obtain an equally precise expression for the probability that the same series of coincidences arises from causation, we should only have to compare the numbers. This, however, can rarely be done. Let us see what degree of approximation can practically be made to the necessary precision.

The question falls within Laplace's sixth principle, just demonstrated. The given fact, that is to say, the series of coincidences, may have originated either in a casual conjunction of causes or in a law of nature. The probabilities, therefore, that the fact originated in these two modes, are as their antecedent probabilities, multiplied by the probabilities that if they existed they would produce the effect. But the particular combination of chances, if it occurred, or the law of nature if real, would certainly produce the series of coincidences. The probabilities, therefore, that the coincidences are produced by the two causes in question are as the antecedent probabilities of the causes. One of these, the antecedent probability of the combination of mere chances which would produce the given result, is an appreciable quantity. The antecedent probability of the other supposition may be susceptible of a more or less exact estimation, according to the nature of the case.

In some cases, the coincidence, supposing it to be the result of causation at all, must be the result of a known cause; as the succession of aces, if not accidental, must arise from the loading of the die. In such cases we may be able to form a conjecture as to the antecedent probability of such a circumstance from the characters of the parties concerned, or other such evidence; but it would be impossible to estimate that probability with any thing like numerical precision. The counter-probability, however, that of the accidental origin of the coincidence, dwindling so rapidly as it does at each new trial, the stage is soon reached at which the chance of unfairness in the die, however small in itself, must be greater than that of a casual coincidence; and on this ground, a practical decision can generally be come to without much hesitation, if there be the power of repeating the experiment.
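The dwindling of the chance hypothesis at each new trial can be made concrete with a rough sketch: however small the antecedent probability we assign to the die being loaded (the figure below is purely hypothetical), the probability of the run arising by chance falls below it after a modest number of successive aces:

```python
from fractions import Fraction

p_loaded_prior = Fraction(1, 1000)  # hypothetical antecedent probability of an unfair die
p_chance = Fraction(1, 1)

for n in range(1, 8):
    p_chance *= Fraction(1, 6)      # chance of n aces in succession from a fair die
    if p_chance < p_loaded_prior:
        print(f"after {n} successive aces, chance ({p_chance}) is below the prior ({p_loaded_prior})")
        break
```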

When, however, the coincidence is one which can not be accounted for by any known cause, and the connection between the two phenomena, if produced by causation, must be the result of some law of nature hitherto unknown; which is the case we had in view in the last chapter; then, though the probability of a casual coincidence may be capable of appreciation, that of the counter-supposition, the existence of an undiscovered law of nature, is clearly unsusceptible of even an approximate valuation. In order to have the data which such a case would require, it would be necessary to know what proportion of all the individual sequences or co-existences occurring in nature are the result of law, and what proportion are mere casual coincidences. It being evident that we can not form any plausible conjecture as to this proportion, much less appreciate it numerically, we can not attempt any precise estimation of the comparative probabilities. But of this we are sure, that the detection of an unknown law of nature—of some previously unrecognized constancy of conjunction among phenomena—is no uncommon event. If, therefore, the number of instances in which a coincidence is observed, over and above that which would arise on the average from the mere concurrence of chances, be such that so great an amount of coincidences from accident alone would be an extremely uncommon event; we have reason to conclude that the coincidence is the effect of causation, and may be received (subject to correction from further experience) as an empirical law. Further than this, in point of precision, we can not go; nor, in most cases, is greater precision required, for the solution of any practical doubt.[7]


  1. Essai Philosophique sur les Probabilités, fifth Paris edition, p. 7.
  2. It even appears to me that the calculation of chances, where there are no data grounded either on special experience or on special inference, must, in an immense majority of cases, break down, from sheer impossibility of assigning any principle by which to be guided in setting out the list of possibilities. In the case of the colored balls we have no difficulty in making the enumeration, because we ourselves determine what the possibilities shall be. But suppose a case more analogous to those which occur in nature: instead of three colors, let there be in the box all possible colors, we being supposed ignorant of the comparative frequency with which different colors occur in nature, or in the productions of art. How is the list of cases to be made out? Is every distinct shade to count as a color? If so, is the test to be a common eye, or an educated eye—a painter's, for instance? On the answer to these questions would depend whether the chances against some particular color would be estimated at ten, twenty, or perhaps five hundred to one. While if we knew from experience that the particular color occurs on an average a certain number of times in every hundred or thousand, we should not require to know any thing either of the frequency or of the number of the other possibilities.
  3. Prospective Review for February, 1850.
  4. "If this be not so, why do we feel so much more probability added by the first instance than by any single subsequent instance? Why, except that the first instance gives us its possibility (a cause adequate to it), while every other only gives us the frequency of its conditions? If no reference to a cause be supposed, possibility would have no meaning; yet it is clear that, antecedent to its happening, we might have supposed the event impossible, i.e., have believed that there was no physical energy really existing in the world equal to producing it. . . . . . . After the first time of happening, which is, then, more important to the whole probability than any other single instance (because proving the possibility), the number of times becomes important as an index to the intensity or extent of the cause, and its independence of any particular time. If we took the case of a tremendous leap, for instance, and wished to form an estimate of the probability of its succeeding a certain number of times; the first instance, by showing its possibility (before doubtful) is of the most importance; but every succeeding leap shows the power to be more perfectly under control, greater and more invariable, and so increases the probability; and no one would think of reasoning in this case straight from one instance to the next, without referring to the physical energy which each leap indicated. Is it not, then, clear that we do not ever" (let us rather say, that we do not in an advanced state of our knowledge) "conclude directly from the happening of an event to the probability of its happening again; but that we refer to the cause, regarding the past cases as an index to the cause, and the cause as our guide to the future?"—Ibid.
  5. The writer last quoted says that the valuation of chances by comparing the number of cases in which the event occurs with the number in which it does not occur, "would generally be wholly erroneous," and "is not the true theory of probability." It is at least that which forms the foundation of insurance, and of all those calculations of chances in the business of life which experience so abundantly verifies. The reason which the reviewer gives for rejecting the theory is, that it "would regard an event as certain which had hitherto never failed; which is exceedingly far from the truth, even for a very large number of constant successes." This is not a defect in a particular theory, but in any theory of chances. No principle of evaluation can provide for such a case as that which the reviewer supposes. If an event has never once failed, in a number of trials sufficient to eliminate chance, it really has all the certainty which can be given by an empirical law; it is certain during the continuance of the same collocation of causes which existed during the observations. If it ever fails, it is in consequence of some change in that collocation. Now, no theory of chances will enable us to infer the future probability of an event from the past, if the causes in operation, capable of influencing the event, have intermediately undergone a change.
  6. Pp. 18, 19. The theorem is not stated by Laplace in the exact terms in which I have stated it; but the identity of import of the two modes of expression is easily demonstrable.
  7. For a fuller treatment of the many interesting questions raised by the theory of probabilities, I may now refer to a recent work by Mr. Venn, Fellow of Caius College, Cambridge, "The Logic of Chance;" one of the most thoughtful and philosophical treatises on any subject connected with Logic and Evidence which have been produced, to my knowledge, for many years. Some criticisms contained in it have been very useful to me in revising the corresponding chapters of the present work. In several of Mr. Venn's opinions, however, I do not agree. What these are will be obvious to any reader of Mr. Venn's work who is also a reader of this.