## Probability as a qualitative spectrum

This is topic 2 on our philosophy topics list.

Everyday language offers a range of phrases, such as "very unlikely" or "very likely", to express chances qualitatively rather than quantitatively, in sentences like

It is very unlikely that convincing evidence of extraterrestrial intelligence will be found in the next year.

It is very likely that the U.S. Presidential election will take place as scheduled on November 6, 2012.

It seems to me self-evident (though I'm not sure how to prove it) that in ordinary life people actually do think about uncertainties mostly in this qualitative way -- from "very unlikely" to "very likely" -- without associating any numerical probabilities. Of course modern people are exposed to numerical probabilities and so, if asked to do so, will happily associate some range of numerical values with phrases like "extremely likely". See e.g. the graphic on this page. But this is quite different from saying that, unprompted, people think quantitatively.
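The kind of phrase-to-number association such surveys elicit can be sketched as follows. The ranges below are invented for illustration, not taken from any actual survey:

```python
# An illustrative mapping of verbal probability phrases to numerical ranges,
# of the kind people produce when prompted. All ranges here are hypothetical.
phrase_ranges = {
    "almost no chance": (0.00, 0.05),
    "very unlikely":    (0.02, 0.15),
    "about even":       (0.45, 0.55),
    "very likely":      (0.80, 0.95),
    "almost certain":   (0.93, 1.00),
}

# The phrases form a qualitative spectrum: the midpoints of the ranges
# increase from phrase to phrase, even though the ranges may overlap.
midpoints = [(lo + hi) / 2 for lo, hi in phrase_ranges.values()]
assert midpoints == sorted(midpoints)
```

The overlapping ranges are the point: prompted respondents produce numbers, but the numbers vary from person to person in a way consistent with an underlying qualitative ordering rather than a shared quantitative scale.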

This page discusses some aspects of the relation between qualitative and quantitative assessments of probability. Let me start by comparing probability with other aspects of the world that seem (in everyday language) a little more subjective or more objective.

### More subjective concepts

There are aspects of the world which we habitually compare on some "lesser to greater" spectrum, but with an odd mixture of definite and fuzzy comparisons -- in some cases we say one instance is definitely "less than" another instance, but in other cases we're not sure. Let me use the following two examples.
• Pain. We are sure that a pinprick is less painful than a toothache, but less sure whether today's toothache is less painful than last week's headache was.
• Criminality. We are sure that murder for profit is more criminal than minor theft, but less sure how to compare counterfeiting and embezzlement.
In examples like these we can invent or adapt some numerical scale -- e.g. there are standard pain scales, and one can use lengths of prison sentences as indicators of the seriousness of crimes. In using such a scale one implicitly regards it as providing valid "more serious than" comparisons, but one does not claim that the numerical values themselves have meaning or that one can do meaningful arithmetic with them. Such a scale is what I will call a qualitative spectrum.
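The distinction can be made concrete in code. Below is a minimal sketch (the scale labels and codes are my own invention) of an ordinal scale on which "less than" comparisons are meaningful but arithmetic on the codes is not:

```python
from enum import IntEnum

# A hypothetical ordinal pain scale. The integer codes record only order.
class Pain(IntEnum):
    NONE = 0
    MILD = 1
    MODERATE = 2
    SEVERE = 3

# Valid use of the scale: a "less painful than" comparison.
assert Pain.MILD < Pain.SEVERE

# The computer will happily do arithmetic on the codes...
gap = Pain.SEVERE - Pain.MILD   # evaluates to 2
# ...but nothing licenses reading that 2 as "twice as painful", or the
# difference SEVERE - MODERATE as equal to the difference MILD - NONE.
# The codes are labels on a qualitative spectrum, not measurements.
```

This is exactly the situation with pain scales and prison-sentence lengths: the ordering carries information, the arithmetic does not.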

Another example, more evidently subjective, is

• movie ratings
which one could express on a numerical scale, 1 to 5, say, or equivalently via the words terrible, poor, OK, good, excellent.

These three examples involve different kinds of subjective judgments -- physical, ethical, aesthetic. Because there is considerable agreement between different individuals' judgments, it is quite wrong to equate "subjective" with "arbitrary".

### More objective concepts

A medieval peasant would surely have understood that weight and length and duration of time were all objective and quantitatively measurable. There is a long history of measurement, and indeed setting and enforcing standard "weights and measures" is one of the ancient responsibilities of government. Other variable aspects of the world, say temperature or noise level, our peasant would (I guess) have recognized as objective, but would not have thought of trying to measure quantitatively, whereas nowadays we know these can be measured in degrees and decibels.

xxx Sensory impressions can be associated with objectively measurable aspects of the real world; but again we should not identify "objective" with "quantitative".

### Probability: between qualitative and quantitative

To me, a natural first way to categorize "probability" is by comparison with the three examples above -- it falls somewhere between the "more subjective" and the "more objective" concepts. Obviously it is not "objectively measurable", except in some very narrow setting of repeatable experiments (xxx cross-ref dogmatic frequentism).

xxx comp with subj?

Let me make several points.

1. The difference between quantitative and qualitative isn't whether you use words or numbers or some creative graphic; it's whether there is some meaningful sense in which 3 is 50% larger than 2.

And in the three examples we have given, there isn't.

2. As mentioned above, it seems to me self-evident that in ordinary life people actually do think about uncertainties mostly in a qualitative way -- from "very unlikely" to "very likely" -- without associating any numerical probabilities. In other words, they often think about uncertainty on the same kind of "qualitative spectrum" as they think about pain or criminality or movie quality. Arguing that one must associate numerical probabilities to instances of "unlikely" or "likely" -- that is, arguing that it is logically unsound to think qualitatively and not quantitatively -- strikes me as ridiculous, whichever way one looks to find analogs of probability. No-one would argue that it is logically unsound to talk about a "severe headache" without quantifying the degree of pain. And in our personal experience, you and I have used words like "quiet" or "noisy" but have rarely or never thought it would be useful to actually measure a sound level in decibels (of course we can imagine contexts where it would be useful).

Implicit in the usual mathematical setup of probability and statistics is the notion that one should associate numerical probabilities to chance events. As a generality, this is clearly ridiculous as regards everyday life and many other non-academic contexts.

3. It seems reasonable to imagine that a person casually stating a 1 in 10,000 chance in the "extraterrestrial intelligence" example or a 99.9% chance in the 2012 election example is actually still thinking qualitatively, but (like giving a movie a 3.5 rating) is assuming that a number will convey the intended place on the spectrum more clearly than a verbal phrase such as "very very unlikely but not impossible".

4. The usefulness of the qualitative spectrum is often under-rated. xxx continue

5. Trying to make a quantitative estimate of a probability is a choice, and one can think of cases where one might not want to do so. I have a rough idea of my chance of living to age 80; typing "U.S.A. life expectancy at age 57" into Wolfram|Alpha would give me a crude estimate (a population percentage); and in the future genetic tests might give a more accurate personalized estimate. But do I want to know?
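The kind of crude population estimate mentioned here would come from a life table: the chance of reaching 80 given survival to 57 is roughly l(80)/l(57), where l(x) is the table's count of survivors to age x. The numbers below are made up for illustration and are not taken from any real U.S. life table:

```python
# Hypothetical life-table survivorship l(x): survivors to age x per
# 100,000 births. These values are invented, purely to show the arithmetic.
survivors = {57: 90_000, 80: 55_000}

# Crude conditional probability of reaching 80, given survival to 57.
p_reach_80 = survivors[80] / survivors[57]
print(f"crude chance of reaching 80 from 57: {p_reach_80:.2f}")  # prints 0.61
```

Note this is a single population-level number; it ignores everything individual about me, which is precisely why a personalized genetic estimate could differ from it.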

More substantially, consider the famous "beyond reasonable doubt" criterion for criminal conviction. The legal profession explicitly refuses to quantify this; if you as a juror asked the judge whether a 97% probability was sufficient, the judge would not give you a straight answer!

6. xxx A special feature of probability, different from other qualitative scales: one can compare an event with its opposite. I don't believe people first estimate a numerical probability and then compare it to 50% -- instead they just compare the event directly with its opposite.

### Quantifying probability as degree of belief

At many places on this site I repeat a desire to avoid being drawn into the traditional Interpretations of Probability debate on what meaning one should attach to a statement like "the chance that team A beats team B in tomorrow's game is 60%". But I do want to say a few words on the concept of probability as degree of belief, because it is important to distinguish two uses of that phrase.

1. When someone says "I give this movie a 3.5 rating" and then adds "but that's just my opinion", the additional comment doesn't add anything, other than as a disclaimer against any other way (e.g. a poll of other people) that a rating might be derived. Similarly, when someone states a 99.9% chance in the 2012 election example, adding "but that's just my degree of belief" doesn't add anything: they are really thinking qualitatively and making up a number to represent "very very likely".

So this is the "disclaimer" meaning for the phrase: "degree of belief" as opposed to anything more substantial.

2. Elsewhere (xxx not written) I talk briefly about the idea of probability as a primitive concept, like length, not reducible to anything simpler. One could consider "quantitative degree of belief" similarly as a primitive concept, in other words treat statements like "my degree of belief that O.J. Simpson was the real killer is 95%" as having a meaning that is precise but not reducible to anything simpler. Within this approach, one can then identify "subjective probability" with "degree of belief". But I don't see any advantage to this approach; surely it would be easier to take probability itself as the primitive concept.

3. Also elsewhere (xxx not written) I talk briefly about the betting interpretation of subjective probability, which is quite different, and does provide one genuinely quantitative interpretation of quantitative probability assertions.

xxx somewhere: point of knowing a probability? Trial, climate change.