The Blind Leading The Rationally Irrational

In a recent post, Constant explored the problem of identifying competence. He observed that some people were probably simply not up to the task:

I happen to think that many people are not competent, and that consequently they rely on false authority - they are the blind led by the blind.

I think this is undoubtedly true, but I also think that the truly incompetent are a small subset of those following the blind. In many areas, most people are capable of more competence than they typically demonstrate.

Constant touches on the prospect of bootstrapping competence:

There are ways to "bootstrap" competence. But these employ a certain kind of competence. For instance, you need to have the competence to distinguish the cases where a specific kind of bootstrapping succeeds from the cases where it fails. (So the thoroughly incompetent are thoroughly screwed. The competent may, of course, guide them, but so may anyone else.) I happen to think that many people accept authorities without the benefit of proper "bootstrapping" - possibly as a result of tragically mistaken "bootstrapping" based on false signs of competence.

Again, I think that those who tragically fail in an honest effort to identify competence certainly exist, but they too are a small subset of those following the blind.

Most people tend to demonstrate competence only when they bear the costs of their own incompetence. In politics, religion, and many other areas, people typically do not pay significant costs for incompetence and irrationality. So they often don't even bother to attempt to identify competence, because there is little reason why they should.

Why, for instance, should the average man in the street bother to invest the effort necessary to identify the competent experts on global warming? Will the individual get a better climate if he correctly identifies the competent experts? No. Will he get a better public policy? Almost certainly not. So why bother?

We all have biases. There is a certain psychic cost to giving up your biases. Even when such costs are not large, they can easily outweigh the microscopic benefits of developing competence in politics. So people quite reasonably prefer their irrational biases to competence in politics.
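
To make that asymmetry concrete, here is a minimal back-of-the-envelope sketch in Python. All of the numbers are invented purely for illustration; the point is only the shape of the comparison.

# Toy expected-utility comparison (all numbers invented for illustration).
# A voter weighs keeping a comfortable political bias against paying the
# psychic cost of correcting it.

P_DECISIVE = 1e-8          # chance one person's belief swings policy at all
POLICY_GAIN = 1_000.0      # personal benefit if the better policy results
PSYCHIC_COST = 10.0        # discomfort of giving up the cherished bias
PSYCHIC_REWARD = 15.0      # comfort of keeping the bias

# Expected personal benefit of becoming politically competent:
benefit_of_competence = P_DECISIVE * POLICY_GAIN   # ~0.00001

keep_bias = PSYCHIC_REWARD                          # +15.0
drop_bias = benefit_of_competence - PSYCHIC_COST    # ~ -10.0

print(f"keep bias: {keep_bias:+.5f}, drop bias: {drop_bias:+.5f}")
# Even a tiny psychic reward dominates, because the policy payoff is
# discounted by the near-zero probability of being decisive.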

This is the greatest social problem humanity faces. And it is almost completely overlooked.


I wouldn't say overlooked

What you have there is basically a massive n-player prisoner's dilemma. It's a problem that's been worked on for a while, and the only solution people have found is the one you discussed: make people responsible for their actions.
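
For concreteness, a minimal sketch of that payoff structure (all numbers invented): each person's sliver of the shared benefit of competence is smaller than the private cost of acquiring it, so "defect" dominates no matter what everyone else does.

# Sketch of the n-player dilemma described above (numbers invented).
# "Cooperate" = pay the personal cost of forming competent political views;
# "defect" = keep cheap biases. Everyone shares the benefit of others'
# competence, but each person bears their own cost in full.

N = 1_000_000        # number of voters
SHARED_BENEFIT = 2.0 # total social value created per competent voter
PERSONAL_COST = 1.0  # individual cost of becoming competent

def payoff(i_cooperate: bool, others_cooperating: int) -> float:
    """One voter's payoff given their own choice and everyone else's."""
    total = others_cooperating + (1 if i_cooperate else 0)
    shared = total * SHARED_BENEFIT / N  # my sliver of the shared benefit
    return shared - (PERSONAL_COST if i_cooperate else 0.0)

for others in (0, N // 2, N - 1):
    print(others, payoff(True, others) - payoff(False, others))
# The difference is SHARED_BENEFIT/N - PERSONAL_COST < 0 regardless of what
# others do: defection dominates, even though all-cooperate beats all-defect.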

Which is why...

...libertarians prefer markets to government since markets tend to impose the costs of the individual's views on the individual.

But how do you get from here to there? Movement libertarians wake up every day and go out and continue to lobby and attempt to persuade the populace, completely overlooking this problem.

To me it's clear that they will never get from here to there by collective politics, but even if they could, they would have to start by studying this problem deeply and tailoring their approach to directly address it. That isn't done because the problem is overlooked.

Reducing the cost

There have been many attempts, sometimes successful, to reduce the cost to people of sorting out reality from fiction. Courts try cases so that each member of the public doesn't have to. Universities identify competent professors so that parents don't have to and award credentials to their students so that prospective employers can more easily distinguish the better prospects. Prize committees select the best of the best periodically so that the rest of us don't have to. Newspapers tell us what really happened so we don't have to rely on gossip. Various government agencies also take it upon themselves to identify and spread the truth about this and that.

But just reading that list, you might despair that if these are what we rely on, then we are lost. While all of these have been helpful at one time or another in the past, they are a bit more fallible than we might be comfortable with. Even worse, they all, even the best ones, have a tendency to be captured by individuals and groups who seek to propagate their own views.

The common flaw may be that these are all entities - individuals or groups - who are implicitly asking the rest of us to trust them. Any such entity immediately becomes a target for capture by interested parties. But is there any better alternative? Arthur describes some tools, but these seem to take a good amount of effort to use. And if somebody says, "don't worry, I will use these tools and then I will tell you whom to trust," then the problem of capture resurfaces.

It doesn't really matter how much you reduce the cost...

...of identifying competence in contexts where there is no cost to the individual for incompetence. As in collective politics.

Cost of incompetence

What about you and me? Or if not us, then surely some people make a genuine effort to sort truth from fiction. To such people, being mistaken is undesirable and therefore a cost. People surely fall somewhere along a range between those willing to go to great lengths to get things right and those who effectively do not care. If the costs are reduced, then more people in the middle of that range will get things right.

My own concern is not primarily political. I would personally benefit from an enlargement of the community of serious truth seekers regardless of whether it remains politically invisible. And I myself would ideally like to sort out truth from fiction without devoting every waking moment to it. My own development so far has been enormously expensive to me.

You and I are outliers...

...and we were probably born that way. It is not a question of intelligence or any earned virtue; we are just constitutionally less attached to our own biases than the overwhelming majority of individuals.

Almost everyone becomes more truth-seeking when their life depends on it, and almost nobody becomes more truth-seeking when there is no tangible benefit.

We happen to be wired so that the psychic rewards of epistemic rationality are greater than, or at least competitive with, the psychic costs of discarding bias. This wiring is simply rare among human beings.

You must have noticed how difficult it is to get most people, even highly intelligent people, to abandon obvious fallacies when there is no personal cost to them. Yet there is a tiny minority of people who seem to do it comparatively effortlessly.

There certainly are learned skills involved, and it certainly takes effort to improve those skills, but the fact remains that most people are not wired to engage those skills as easily as you, however otherwise well equipped they may be.

So if by serious truth seekers we mean those who engage in epistemic rationality when there is little tangible benefit, then I think you can only find such individuals, not fruitfully recruit them from the general population via persuasion, and you will only ever find them in very small numbers.

What kind of competence?

What kind of competence? Physical competence? Intellectual competence? Emotional competence? Social competence?

You can't "talk" someone into being competent in any way. The overreliance on talking and political persuasion is a dead giveaway for incompetence, in terms of bringing about meaningful helpful change. (credit to Patri)

You all need concrete projects to work on besides BSing - something relating to the ultimate meaning of this blog. You need to be doing something with measurable progress. Talking is just another way of killing time.

Doing depends on thinking

You all need concrete projects to work on besides BSing - something relating to the ultimate meaning of this blog. You need to be doing something with measurable progress. Talking is just another way of killing time.

It's nothing new. I've heard it called praxis by leftists:

Praxis is the process by which a theory, lesson, or skill is enacted or practiced.

and

The concept of praxis is important in Marxist thought. In fact, philosophy of praxis was the name given to Marxism by 19th century socialist Antonio Labriola. Marx himself stated in his Theses on Feuerbach that "philosophers have only interpreted the world in various ways; the point is to change it." Simply put, Marx felt that philosophy's validity was in how it informed action.

So the Marxists went ahead and did praxis, and turned much of the world into a Marxist hellhole. A key element of praxis is confidence that you are right - in order to act, you need to have decided. Thought comes before decision, action comes after. "The time for thinking is over! The time for action has begun!" - that sort of thing.

They might have benefited from a lot less action and a lot more thought. The world might have benefited. Praxis claimed billions of victims.

The overreliance on talking and political persuasion is a dead giveaway for incompetence, in terms of bringing about meaningful helpful change.

Political persuasion is not the only function of talking. Talking also plays a role in thinking. People who think a great deal often talk to themselves - i.e. verbal thought may play a large role in thinking (along with visual thought etc.). For instance, scientists write down symbolic equations, and then transform those equations by rules. An equation is as much symbolic communication as a sentence, and can be translated into a sentence (though an extremely difficult one to parse - hence the advantage of symbolic notation).

Arguments for Political Theory X are not merely "persuasion", they can also function as elements of cooperative thinking about Political Theory X. If your goal is persuasion, then powerful arguments against X are as deleterious to your goal as arguments for X are beneficial to your goal. But if you are seriously thinking about X, then powerful arguments against X are as welcome as arguments for X, as they form the basis of thought about X.

You may view this blog as propaganda "BS", as a purely verbal and therefore defective attempt to implement libertarianism, as verbal praxis, but I view it as part of a serious discussion. There is always a place for reflection, for thought, even while, elsewhere in the world, praxis is happening. Thinking, and therefore talking, are not just a way of killing time.

And yes, I know that praxis can be viewed as an experimental phase of thought. We can view the history of the Soviet Union as, in effect if not intent, one large experiment in Marxism. But there's plenty of room for non-experimental phases of thought, and we all would have been better off if Marxists had taken the course of engaging others in open discussion as opposed to the course of implementing their ideas as is.

I'm talking about Epistemological Competence

I'm talking about the grounding of theory in reality, the ability to discern truth.

Michael Huemer explains:

Instrumental rationality (or “means-end rationality”) consists in choosing the correct means to attain one’s actual goals, given one’s actual beliefs. This is the kind of rationality that economists generally assume in explaining human behavior.

Epistemic rationality consists, roughly, in forming beliefs in truth-conducive ways—accepting beliefs that are well-supported by evidence, avoiding logical fallacies, avoiding contradictions, revising one’s beliefs in the light of new evidence against them, and so on. This is the kind of rationality that books on logic and critical thinking aim to instill.

The theory of Rational Irrationality holds that it is often instrumentally rational to be epistemically irrational. In more colloquial (but less accurate) terms: people often think illogically because it is in their interests to do so. This is particularly common for political beliefs. Consider one of Caplan’s examples. If I believe, irrationally, that immigrants are no good at running convenience marts, I bear the costs of this belief—e.g., I may wind up paying more or traveling farther for goods I want. But if I believe—also irrationally—that immigrants are harming the American economy in general, I bear virtually none of the costs of this belief. There is a tiny chance that my belief may have some effect on public policy; if so, the costs will be borne by society as a whole (and particularly immigrants); only a negligible portion of it will be borne by me personally. For this reason, I have an incentive to be more rational about immigrants’ ability to run convenience marts than I am about immigrants’ general effects on society. In general, just as I receive virtually none of the benefit of my collecting of political information, so I receive virtually none of the benefit of my thinking rationally about political issues.

The theory of Rational Irrationality makes two main assumptions. First, individuals have non-epistemic belief preferences (otherwise known as “biases”). That is, there are certain things that people want to believe, for reasons independent of the truth of those propositions or of how well-supported they are by the evidence. Second, individuals can exercise some control over their beliefs. Given the first assumption, there is a “cost” to thinking rationally—namely, that one may not get to believe the things one wants to believe. Given the second assumption (and given that individuals are usually instrumentally rational), most people will accept this cost only if they receive greater benefits from thinking rationally. But since individuals receive almost none of the benefit from being epistemically rational about political issues, we can predict that people will often choose to be epistemically irrational about political issues.

There may be some people for whom being epistemically rational is itself a sufficiently great value to outweigh any other preferences they may have with regard to their beliefs. Such people would continue to be epistemically rational, even about political issues. But there is no reason to expect that everyone would have this sort of preference structure. To explain why some would adopt irrational political beliefs, we need only suppose that some individuals’ non-epistemic belief preferences are stronger than their desire (if any) to be epistemically rational.
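
Huemer's two assumptions reduce to a simple decision rule. Here is a hypothetical sketch in Python (the function and all numbers are my own invention, for illustration) applying that rule to Caplan's two beliefs from the quote above.

# A sketch of the Rational Irrationality decision rule (invented numbers).
# The prediction: a belief is examined honestly only when the believer
# personally internalizes enough of the cost of being wrong.

def chooses_rationality(psychic_value_of_bias: float,
                        cost_of_error: float,
                        share_borne_personally: float) -> bool:
    """True if the expected personal cost of the bias exceeds its comfort."""
    return cost_of_error * share_borne_personally > psychic_value_of_bias

# "Immigrants can't run convenience marts": I pay the full cost of error
# (longer trips, higher prices), so I internalize all of it.
print(chooses_rationality(psychic_value_of_bias=5.0,
                          cost_of_error=50.0,
                          share_borne_personally=1.0))    # True

# "Immigrants harm the economy in general": the cost of error is huge but
# spread across society, and my belief barely moves policy anyway.
print(chooses_rationality(psychic_value_of_bias=5.0,
                          cost_of_error=1e9,
                          share_borne_personally=1e-12))  # False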

And my supposition is that those who exhibit general epistemic rationality and competence do not typically have especially high preferences for truth; instead, they have unusually low preferences for their own biases. As a matter of temperament, they are simply less emotionally invested in their own views.

"Rational Irrationality" is

"Rational Irrationality" is a junk term. It's contradictory.