
Thursday, February 13, 2025

Can anything be assigned a probability?

In the previous post I mentioned the book Radical Uncertainty: Decision Making Beyond the Numbers by Mervyn King and John Kay. Written by two prestigious British economists, the book attacks the misuse of statistics and the probability calculus in fields where they are not always applicable, such as history, economics and the law. Let’s look at a few examples:

  • What do we mean when we say that Liverpool F.C. has a 90% chance of winning the next match? One possible interpretation is that if the match were played a hundred times, with the same players, the same weather conditions and the same referee, Liverpool would win 90 times and draw or lose the other ten. But the match will be played just once. Does it make sense to talk about probabilities? No, because there are no frequency data to support them. What is meant is that the person speaking believes that Liverpool will win. Nothing more. It is a subjective probability. Milton Friedman wrote: “We can treat people as if they assigned numerical probabilities to every conceivable event” (Price Theory, 1962).

Thursday, July 11, 2019

Zero probability


In a previous post I mentioned that an event can happen once or several times, even though its probability is zero. The probability of an event is defined as the ratio of the number of favorable cases to the number of possible cases. Therefore, if the number of possible cases is infinite while the number of favorable cases is finite, the probability turns out to be zero.
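In symbols (my own restatement of the definition above, not part of the original post): if there are k favorable cases out of N possible ones, the probability is k/N, and letting N grow without bound while k stays finite gives

\[
P \;=\; \lim_{N \to \infty} \frac{k}{N} \;=\; 0 .
\]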
At first glance it seems incredible that an event with zero probability can actually happen. I think the matter will be clearer with a simple example. Two friends, A and B, are talking, and what they say is this:
A: If I ask you to choose a number between 1 and 100, what is the probability that you choose a specific number, such as 25?
B: 1/100, obviously.
A: If I ask you to choose a number between 1 and 1000, what is the probability that you choose 25?
B: 1/1000.
A: If I ask you to choose a number between 1 and 10,000, what is the probability that you choose 25?
B: 1/10,000.
A: If I ask you to choose a positive integer number, what is the probability that you choose 25?
B: Zero, for the set of positive integers has infinitely many elements, and one divided by infinity equals zero.
A: Choose any number among all the positive integers and tell me which number you have chosen.
B: I choose 2^2500 - 1.
A: You have just brought about an event with zero probability.
Thinking a little, you’ll see that the probability of choosing, from among all the integers, any given finite set, however large, is also zero. For instance:
A: If I ask you to choose ten different numbers between one and one hundred, what is the probability that you choose precisely the numbers between 11 and 20 (their order does not matter)?
B: 1 / 17,310,309,456,440
A: And if I ask you to choose ten different numbers among all the positive integers, what is the probability that you choose precisely the numbers between 11 and 20?
B: Zero.
I leave it to the curious reader to work out why the probability of choosing the numbers 11 to 20 from among those between one and one hundred is precisely the figure B has stated.
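For readers who want to check B’s figure, here is a minimal sketch in Python (my addition, not part of the original post) that counts the ways of choosing 10 distinct numbers out of 100:

    import math

    # Number of ways to choose 10 distinct numbers out of 100 (order irrelevant)
    ways = math.comb(100, 10)
    print(ways)        # 17310309456440
    print(1 / ways)    # probability of picking exactly the set {11, ..., 20}

Only one of those combinations is the set {11, ..., 20}, which is why B answers 1/17,310,309,456,440.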
To finish this post, I’ll propose a few more exercises for the reader (a short sketch for exploring them follows the list). Whoever solves them has the opportunity to write a comment explaining how they arrived at the solution.
1. What is the last digit of 6^2500?
2. What is the penultimate digit of 6^2500?
3. What is the penultimate digit of 6^1,000,000?
4. What is the probability that the last digit of 6^n is odd?
5. What is the probability that the penultimate digit of 6^n is odd?
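If you want to experiment before committing to an answer, this short Python sketch (my addition, not part of the original exercises) prints the last two digits of 6^n for small n, using modular arithmetic so the numbers never get large:

    # Last two digits of 6^n, computed modulo 100 to keep the powers small
    for n in range(1, 11):
        print(n, f"{pow(6, n, 100):02d}")   # three-argument pow does modular exponentiation

Watching the pattern that emerges should point toward the answers, which I will not spoil here.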

Manuel Alfonseca
Happy summer holidays. See you by mid-August

Thursday, February 5, 2015

The monkey pounding on a typewriter

In connection with the fine tuning problem, the typing-monkey argument is often used to illustrate that even very unlikely events can occur spontaneously. Depending on the author of the quote, the text supposedly written by the monkey can be the complete works of Shakespeare, Don Quixote, or even a shorter and less specific work. For instance, John Leslie, in his book Universes (Chapter One), writes:
Our universe can indeed look as if designed. In reality, though, it may be merely the sort of thing to be expected sooner or later. Given sufficiently many years with a typewriter, even a monkey would produce a sonnet.

Thursday, December 4, 2014

The fine tuning problem

In two previous posts I dealt with the relation between multiverse theories and the fine tuning problem, noting that those theories do not solve it. This third post briefly describes what the fine tuning problem is.
Brandon Carter
In 1973 Brandon Carter formulated the anthropic principle, a name its author later came to regret, because it lends itself to misunderstandings. This principle is simply the observation that the universe must fulfill all the conditions necessary for our existence, since we are here.
Over a decade later, John Barrow and Frank Tipler published a book entitled The Anthropic Cosmological Principle, which offered a stronger version of the anthropic principle, positing that the values of many of the universal constants are critical: minor variations would make life impossible. This finding gives rise to the fine tuning problem, based on the analysis of the possible effects of changing the values of those constants. In other words, the universe seems designed to make life possible. Let’s look at a few examples:

Thursday, November 6, 2014

The probability of existence of extra-terrestrial intelligence

Figure: a normal statistical distribution. The text, however, refers to a uniform distribution.
Probability is a well-known mathematical concept that was initially defined to quantify random data in mathematically known environments and has been extended to other situations.
For instance, the probability that the next car passing near me has a license plate with four identical digits is computed by dividing the number of favorable cases by the number of possible cases. The first number is ten: 0000, 1111, 2222, ..., 9999. The second is ten thousand: 0000, 0001, 0002, ..., 9998, 9999, assuming a uniform distribution. Therefore the probability in question is one in a thousand. Here we haven’t considered that vehicles can be removed from circulation, an independent random process that would not significantly change the result of the computation.
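As a quick sanity check of that arithmetic (my addition, not in the original), one can simply enumerate all ten thousand plate numbers and count those whose four digits are identical:

    # Count plate numbers 0000..9999 whose four digits are all identical
    favorable = sum(1 for n in range(10_000) if len(set(f"{n:04d}")) == 1)
    print(favorable, favorable / 10_000)   # 10 0.001

The count is 10 and the ratio 0.001, i.e. one in a thousand, as stated.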
The problem is, sometimes we are interested in estimating probabilities in mathematically unknown environments. This can happen, for instance, when we do not know the number of favorable cases, or the number of possible cases, or both. In such situations, we can only estimate the unknown data, with greater or lesser uncertainty. We speak then of a priori probability.