24 January 2007

Tags: climate, information quality, critical thinking


The Quality of Information is Not Strain'd

Some half-arsed pseudo-philosophical ramblings follow. They start with some musings on transistors and eventually circle around to show some relevance to current events, specifically the climate change debate. But be warned, it will take a while.

The Secret Life of Transistors

When I was at university a professor said to me that there were probably only a handful of people in the world who understand how a transistor works. Such are the complexities of the science behind this seemingly simple device.

Many times in my software development career I have pondered this, particularly when confronted with a nasty bug. In those circumstances I pride myself on being fairly unafraid of digging deep into the inner workings of the machine. I had the confidence to learn what I needed to know in order to successfully fix the problem, even if I couldn't always discover its root cause. However, knowing that the workings of the transistor were to all intents and purposes out of reach was a somewhat sobering thought. Not that I have — to my knowledge anyway — faced any bugs that ultimately required an understanding of the transistor. But this limitation was always lurking there at the back of my mind.

In attaining maturity, I expect that everyone has a similar realisation. Given the finite amount of time that we all have available to us, there is no way that we can know everything about everything. So at some point we have to take things upon faith.

The F Word

Faith is possibly the wrong word because it involves belief in the absence of evidence. In the case of the transistor, all the facts are there for me to discover, but I simply don't have the time to delve in there and discover them for myself. When the software crashes and I'm up against a deadline, I'm not going to reach for the physics textbook. So like most normal people, I accept certain facts "on faith". Without direct knowledge, but with a certain amount of confidence in the word of others.

But on the other hand, even the experts in transistors will have to defer first to other scientists, and then ultimately to the limits of human knowledge in the fundamental workings of the universe. In practice, of course, you are unlikely to ever reach the limit of human knowledge in solving a problem. Instead you are likely to reach a point where highly specialised knowledge is required, demanding more in-depth study than the problem at hand warrants. Then you have to fall back on other techniques if you are to proceed.

The Quality of Information is Not Strain'd

When it is impractical to acquire detailed knowledge for yourself, you are often forced to rely on the information of others. This in turn brings in a more subjective analysis of the information, relying on secondary indicators. For convenience I will refer to this as the act of assessing the quality of the information.

We all know how to assess the quality of the claims of others. You look at the motivations that cause people to say one thing over another. You look at their past history of credible reporting. You look at the level of respect shown by their peers. You look at whether their views are disputed or not. Whether or not anyone else has examined them critically. All the usual sorts of things.

What you're chasing is an assessment of the quality of the information. You don't know that it's right based on empirical evidence and your expertise, instead you try to recognise the characteristics that correct information often exhibits. Note that I'm not talking about "truthiness", where you're ignoring all evidence and fundamentally relying on wishful thinking. I'm just saying that in practice critical thinking often involves relying on secondary indicators, in lieu of a detailed and objective analysis of the information itself.

The Sceptical And The Naïve

With any subjective analysis, there is plenty of room for disagreement between individuals examining the same information. For example, you may think that the New York Times is a rag and not to be trusted, and be immediately suspicious of information published there. And of course I may think the opposite. However I would contend that with most qualitative judgements, broad consensus can be reached, even if only at the extremes. There would be, I imagine, very little disagreement that an average bottle of Châteauneuf-du-Pape was superior to even the best bottle of Tyrell's Long Flat Red, even amongst people who don't drink wine often.

I think that the degree of scepticism we exhibit corresponds to our standards of acceptable information quality. Sceptical (perhaps critical is a better adjective) people will demand higher quality of information, and naïve people will be satisfied with lower quality.

It's fiendishly hard to remove one's own prejudice when assessing the claims of others. If you are averse to a given claim for whatever reason, you're naturally going to require a higher standard of proof from others. Likewise if you're receptive to an idea, perhaps because it benefits you, you're already predisposed to believe it. We are all susceptible to such wishful thinking, and resisting it requires a great degree of self-awareness.

Like all judgements they can of course be inconsistent, and this sometimes raises questions about the person making the judgement. If someone is presented with two objectively similar items, and they reject one — and only one — as low quality, you have to wonder if they know what they're talking about. What criteria were they using to assess quality? For example, if someone tried to tell you that the Aston Martin DB5 was a masterpiece, but the DB6 was an automotive abomination on par with the Ford Pinto, you'd have to be wary about any vehicular advice they gave subsequently.

So it is with the subjective quality of information. I can understand being naïve or being critical, but being alternately one then the other is not generally commendable behaviour.

Analysis Paralysis

There is however, at least one factor which might legitimately cause a person to lower their standards, and accept low-quality information. That factor is time.

When a course of action must be decided based on certain information, that information needs to be assessed up to the point when lack of action becomes the biggest risk. So if you're being chased by a bear, there's only so much time you can take to assess whether or not the bridge across the stream is safe. At some point you just have to cross the bridge, or swim the stream, regardless of how unsafe either alternative looks.

In decision making, ideally you'd like to find a balance between the risk of inaction on the one hand, and the uncertainty of the information on the other. As time passes, you get more information about the situation, but you have less time in which to act on it. You don't want to act on incomplete information, but on the other hand you still need to act. It's often a tricky call to decide when (or whether) to act based on incomplete information.
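Since I've spent the whole post talking about software, the trade-off above can be caricatured in a few lines of code. This is strictly a toy sketch with invented numbers and curve shapes, not a real decision model: uncertainty shrinks as you wait, the risk of having done nothing grows, and at some point the combined cost starts rising again.

```python
# A toy model of the trade-off described above. Waiting reduces
# uncertainty about the information, but increases the risk of
# inaction. All parameters here are made up for illustration.

def expected_cost(t, uncertainty_0=1.0, decay=0.5, inaction_rate=0.2):
    """Combined cost at time step t: uncertainty decays
    exponentially while the risk of inaction grows linearly."""
    uncertainty = uncertainty_0 * (decay ** t)
    inaction_risk = inaction_rate * t
    return uncertainty + inaction_risk

def best_time_to_act(horizon=10):
    """Pick the time step (0..horizon) with the lowest combined cost."""
    return min(range(horizon + 1), key=expected_cost)
```

With these particular numbers the combined cost bottoms out after a couple of time steps: wait a little to learn more, but not all night at the roulette wheel.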

And Now For Something Completely Topical

For a long time our dear Prime Minister maintained that he was unsatisfied with the quality of information relating to climate change. Obviously he isn't a climate scientist, but instead relies on an assessment of the information presented to him. For a long time he apparently had high standards of quality for such information.

Then one day he announced he was satisfied with the evidence and conceded that anthropogenic climate change was a reality and that something needed to be done to reduce carbon emissions. In order to decide on a course of action, he had to assess the quality of the options available.

The information about the Prime Minister's option of "clean" coal is of spectacularly low quality. It is almost completely unproven whether it will work at all, let alone meet the required emission targets. (Nuclear power is, on the other hand, at least a more-or-less known risk, but let's ignore it for now.)

Here's the problem though. The Prime Minister has flipped from being overly sceptical to overly naïve in a matter of months. He's been sitting at the metaphorical roulette wheel all night trying to calculate the odds, and finally, with seconds to go before closing, he puts all his money on one number. In short, he was forced into a risky course of action by his scepticism. At least, that was the stated reason.

I think we've all been in similar situations, albeit with far less severe consequences. In my business it can often occur with the tricky bug.

Consistency Is a Virtue

Time-constrained decisions are a special case though. If you can take the time constraints out of the equation, people should be held to a level of consistency in their judgements about the information presented to them. Like I said above, I think it is acceptable to be either sceptical or naïve, as long as there is some degree of consistency.

When there is no such consistency, it is a pretty blatant indicator that other biases are at work. And that's another explanation for the Prime Minister's sudden change of heart on climate change, but that's for you to judge...


Posted by
2007-01-24 12:39:00 -0600

In the spirit of pseudo-philosophical ramblings, I offer nomenclature flavored with hazy analogy.

Instead of using "faith" as you do in your introduction, I propose "trust."

Faith seems more of a binary state -- either you believe or you do not believe. Worse, "faith" has become a loaded word lately, especially here in the US. It now often connotes a blind acceptance, with the implication that there is no evidence to form a basis for a given belief. Godless heathens like me dismiss most religious beliefs for this reason. The more extreme on the other side try to turn the term around, characterizing my belief in, say, evolution, as also (just) a matter of faith, implying that it is equally baseless.

Trust, on the other hand, seems more of a continuum. It connotes a sliding scale. We often use phrases like "he's pretty trustworthy" and "I can trust her on this." Further, the degree of trust in a given source is continually readjusted, based on observed results obtained from relying on this source. Referring to your comparison of the skeptical and the naive, I might stretch my analogy of a sliding scale to say that the latter have less fine granularity available to them. Think of the volume knob on a radio with a limited set of fixed positions. The ultimate in naivete comes when there are only two positions available on the dial.

It seems to me that another distinction can be drawn by examining your behavior when options are available. If you trust something, you'll still likely double-check it, or plan a fallback position, if these options aren't too expensive. If you have faith in something, however, you don't bother. This seems to equate faith with 100% trust.

(Please pardon my Americanisms: I could not make myself type "sceptical.")