What to Believe #

In order to make good decisions, people need to build an understanding of how the world works by gathering information. Luckily, this is getting easier as technology becomes more efficient at moving data around the world. Unfortunately, simply gathering information is not enough; one must also make sure that the information is of high enough quality to use in world modeling or decision making.

There are many reasons why information people share with each other may be misleading or wrong.

See, for example: https://thedietwars.com/how-to-determine-who-is-a-legitimate-science-based-health-expert/

Stories #

To transfer information, people share stories with each other. These stories get etched in people’s minds as they hear them, and can get more deeply etched with repetition or other validation.

Many (technically most) stories are not accurate descriptions of the real, physical world.

Inaccurate stories can be useful. For instance, they can be “stepping stones to the truth”, like our story about how electrons are particles that orbit the nucleus of an atom (the Bohr model). They can also provide practical ways to encourage a behavior; for instance, the story that virtuous people are rewarded in the afterlife, or Ayurvedic medicine promoting health.

Unfortunately, inaccuracy can also lead to bad decision making with serious consequences. To be safe, it would be ideal if we could clearly determine inaccuracy and weigh that in our consideration of whether or not to use the story to inform our behavior.

Story Plausibility vs. Accuracy #

Firstly, it is important to know that plausibility DOES NOT correlate well with accuracy, especially in complex systems. This is true even when the plausible theory holds in controlled experiments done on a simplified version of the complex system (e.g. a model organism). There are many, many more plausible statements about systems like our economy or the human body than there are statements which are actually true. Some random examples (NOT MY OPINIONS):

Health #

  • Any dietary sugar crystallizes in the blood, causing arterial damage.
  • Sugar has no harmful effects on the body when <50g/day are consumed.
  • Saturated fat hardens in the blood, clogging arteries.
  • Saturated fat does not clog arteries, it gets dissolved in blood just fine.

Economy #

  • Poor people are poor because they don’t work hard.
  • Poor people are poor because powerful people with malicious intent are taking their resources.
  • Poor people are poor because our economy makes it hard to increase your net worth.

Education #

  • Girls and boys have inherent (from birth) differences in what topics they are interested in, and therefore end up finding careers in.
  • Girls and boys have the same chance to become interested in something at birth, but are unevenly distributed in careers due to how they are raised.

More theories like these, which were widely believed but are now disproven, can be found on this Wikipedia page, as well as in this study of medical therapies.

Note that some stories can be completely true, but still not have the effect they imply. For instance, certain foods contain toxins that kill and/or mutate our cells. But this does not mean that we shouldn’t eat those foods (perhaps every food falls into this category?).

See https://superbowl.substack.com/p/causal-explanations-considered-harmful for a good exploration of misleading causal theories.

For each of the above statements/stories, there are groups of people who would find them plausible. Since some directly contradict each other, how are we to know which to believe?

The Scientific Method #

We can use scientific reasoning to falsify inaccurate stories! Ideally, we would do this for all information we incorporate into our decisions; if no one has tried to falsify a certain piece of information, then we probably shouldn’t use it for important decision making.

Unfortunately, science is really hard, so many falsification studies have not been done. Additionally, science doesn’t give hard yes-or-no answers - it gives graded evidence, like p-values, which describe how surprising the observed data would be if there were no real effect, given a long list of assumptions and constraints. Context is crucially important - studies are generally very narrow in the situations they apply to! To accurately incorporate a scientific finding into a justification for a decision, one should understand the uncertainties in the study and the context required to apply it.
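
To make the p-value point concrete, here is a minimal sketch (a toy simulation of my own, assuming numpy and scipy are available, not drawn from any particular study): even when there is no real effect at all, the conventional p < 0.05 threshold will still flag about 5% of experiments as “significant”.

```python
import numpy as np
from scipy import stats

# Toy simulation: even when the null hypothesis is TRUE (no real
# difference between groups), roughly 5% of experiments will still
# produce p < 0.05.
rng = np.random.default_rng(0)

n_experiments = 10_000
false_positives = 0
for _ in range(n_experiments):
    # Both groups drawn from the SAME distribution -> no real effect.
    a = rng.normal(loc=0.0, scale=1.0, size=30)
    b = rng.normal(loc=0.0, scale=1.0, size=30)
    _, p = stats.ttest_ind(a, b)
    if p < 0.05:
        false_positives += 1

print(f"'Significant' results with no real effect: "
      f"{false_positives / n_experiments:.1%}")  # ~5%
```

So a single “significant” result, on its own, says much less than it seems to; the assumptions behind the test and the base rate of true effects matter just as much.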

Scientific studies can also simply be wrong due to errors in their execution, even if the experimenters themselves were unbiased. A good practical way to get around this problem is to read review articles instead of individual studies.

Applying findings carefully is hard, just like coming up with scientific findings in the first place. If we as a society deeply recognized this difficulty, we might give more respect to good scientific findings, and have more informed debates about what actions to take on big, important issues. This recognition might also make people more patient and willing to embrace the difficulty, instead of looking for shortcuts.

(I have nothing against finding the most efficient ways to solve problems, except when these shortcuts are less rigorous or otherwise worse than harder solutions AND we pretend that they are just as good simply because of their convenience.)


People and Stories #

When it comes to deciding which stories are accurate, or worth etching deeply into your own mind, it is important to select stories that have had a lot of effort put toward their falsification, yet still hold up. To verify this process, it is ideal to try your hardest to falsify the story in question yourself.

Looking at what others believe is a good way to find new stories, and can also be a good way to vet them. It is crucial, however, to think about WHY believers trust the stories they do. If they have not actively tried to falsify their own stories, that is a red flag that their belief may not be sound. Evidence that people have tried this falsification includes:

  • Advertisement of conditions under which their story does not hold true.
  • Write ups of experiments that were done in the effort of falsification.

At a higher level, if the person in question has a history of changing their beliefs when shown new evidence, that is also a good sign.

Be wary of people who have a long history of believing the same story, or who have financial or social incentives to believe it: confirmation bias may blind them a bit to new information.

In extreme cases, people who do not try to falsify their own stories can become zealots: so blind to the limitations of their own beliefs that there is no practical difference between them and someone intentionally promoting a belief they know to be wrong (lying).

One thing to note is that holding hard to old beliefs seems to be a natural thing for humans to do, and there are ways to exploit this tendency without having to change people’s minds:

  • VSauce reasoning video
  • Antifragile discusses how it might be ideal to have many strongly supported ideas fight it out in e.g. a market to find the best ones and move the whole system forward optimally.

This is related to the idea that “science advances one funeral at a time”, which assumes that (1) famous, accomplished scientists direct future research and (2) these scientists have entrenched beliefs. Combined, these mean that new research directions that do not fit the beliefs of the accomplished scientists do not get funding/attention. It would be nice if this weren’t so true, but at the least there is still advancement *shrug*.

See also the “Self Deception” section of Perspective.

Reasons People Share Ideas #

Thinking about why an idea is being shared might make your decision of whether to believe it or not easier. Note that all of these reasons are extremely important and enable our modern society; I’m not saying that if you are listening to e.g. an advocate you can just ignore everything they are saying.

  • Truth Sharing: Simply to share the truth with others, for them to do what they will with it. This type of sharing happens a lot in dense academic research papers that effectively just share numbers and contain very little interpretation. It’s easy to think you are doing this when you actually aren’t. Examples:
    • Michael Lustgarten’s YouTube Channel
  • Advocacy: To change the behavior of others. For instance, sharing pictures of smokers’ lungs to get people to stop smoking, or trying to get atheists to accept God.
    • Public health initiatives
    • COVID vaccine advertisements
  • Brainstorming/Theorizing: To get others to think about a problem or a model of the world that might be useful but may also be inaccurate; partially in the hope that others will find the model useful, and partially in the hope that others will improve or soundly reject it.

See also “The Enigma of Reason” for an excellent theory about how people reason to defend theories, not to generate them.

“Experts” #

Some fields, like chess, can legitimately have experts - people who deeply understand the problem space and make predictions about it very accurately. Other fields are less amenable to this - see Complex Systems.

Belief Matrix #

The following matrix provides a simplified view of where a person can stand on a single statement. The East/West axis describes how strongly the person agrees or disagrees with the statement. The North/South axis describes the level of evidence the person personally understands and accepts in support of the statement. Note that “scientific consensus” here does not mean the agreement of many trusted scientists, but rather the agreement of their work itself (e.g. compiled and reviewed by the person trying to form an opinion). Therefore, for example, if a person personally accepts strong evidence supporting the statement yet vocally disagrees with it, they must be lying.

Note that it is very difficult to know the extent of a person’s understanding of an issue. Therefore, it can be hard to distinguish liars from skeptics, or even liars from zealots supporting the counter-idea. At least we can make some assumptions about the popularity/availability of scientific studies: if an “expert” in a field ignores readily available work, it is more likely that they are not an honest knowledge spreader.

                              Scientific Consensus
                              Understood by Person
             -------------------------------------------------------
             |  A liar or propagandist  |  Someone spreading       |
             |  with hidden motives     |  real information        |
             |  (good or bad)           |                          |
             -------------------A Plausible Idea--------------------
Vocal        |  A skeptic               |  A zealot or person      | Vocal
Disagreement |                          |  with blind belief       | Agreement
             -------------------A Plausible Idea--------------------
             |  Someone spreading       |  A liar or propagandist  |
             |  real information        |  with hidden motives     |
             |                          |  (good or bad)           |
             -------------------------------------------------------
                              Scientific Rejection
                              Understood by Person
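
Since the matrix is effectively a lookup table, here is a minimal sketch (Python, with axis values and labels of my own naming) that encodes it as one:

```python
# A toy encoding of the matrix above as a lookup table. The labels are
# taken directly from the diagram; nothing more rigorous is intended.
QUADRANTS = {
    ("consensus", "disagree"): "a liar or propagandist (hidden motives)",
    ("consensus", "agree"):    "someone spreading real information",
    ("none",      "disagree"): "a skeptic",
    ("none",      "agree"):    "a zealot or person with blind belief",
    ("rejection", "disagree"): "someone spreading real information",
    ("rejection", "agree"):    "a liar or propagandist (hidden motives)",
}

def belief_position(evidence_understood: str, stance: str) -> str:
    """evidence_understood: 'consensus', 'rejection', or 'none'
    (the scientific evidence the person understands and accepts);
    stance: 'agree' or 'disagree' (vocally)."""
    return QUADRANTS[(evidence_understood, stance)]

print(belief_position("none", "agree"))  # -> a zealot or person with blind belief
```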

I personally believe the plausible idea (I haven’t looked for or found studies yet) that certain people are more prone to falling quickly into a certain quadrant when given a new idea. Additionally, if someone is shown to be OK with lying about ideas in one area, they are probably more likely to do so in other areas as well. So, if one wants to save time/energy/resources when reviewing ideas, one could discount ideas supported by people who have lied about, or shown blind belief in, other ideas.
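
To make the discounting idea concrete, here is a minimal Bayesian sketch. All the numbers are hypothetical, picked only to show the mechanics: one confirmed lie can sharply lower the probability that a source is a careful truth-teller.

```python
# Hypothetical numbers, chosen only to illustrate the mechanics of
# discounting a source after catching them in one confirmed falsehood.
p_careful = 0.80                # prior: the source is careful about truth
p_lie_given_careful = 0.02      # careful sources rarely assert falsehoods
p_lie_given_careless = 0.30     # careless sources do so far more often

# Bayes' rule: P(careful | lie) = P(lie | careful) * P(careful) / P(lie)
p_lie = (p_lie_given_careful * p_careful
         + p_lie_given_careless * (1 - p_careful))
p_careful_after = p_lie_given_careful * p_careful / p_lie

print(f"P(careful) before: {p_careful:.2f}, after one lie: {p_careful_after:.2f}")
# With these made-up numbers, trust drops from 0.80 to ~0.21, which is
# why it can be rational to discount a known liar's other claims too.
```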

However, if one has the time and wants the most complete picture, it is in my opinion worth examining ideas from these people, since (again, in my opinion) it is very unlikely that they are completely incorrect in all their opinions. In fact, some of the most novel, and therefore potentially useful, ideas can come precisely from strong believers in yet-untested plausible ideas.


----------------- OLD THOUGHTS BELOW HERE ------------------

One useful lens for looking at this further is to think about where shared information ultimately comes from:

From Past Personal Experience #

What works #

Tinkerers (as described in the book Antifragile) are people who use feedback from a specific task to improve and eventually become experts at that task. For instance, a self-taught woodworker will eventually learn how to create complex wooden structures based on past experience of how wood behaves when he or she works with it. Or a chess player may learn, after winning or losing many games, that certain patterns of pieces are dangerous or safe.

Information derived from tinkerers is:

  • Often very practically useful
  • Rarely completely “wrong”, just maybe only right in a very narrow scope.

But has these problems:

  • Deeper theories or reasons behind why the information is what it is are often absent or unrecoverable. This limits the scope at which the information can be accurately applied.
  • Reproducing the logic behind the information can be very difficult or impossible, since it requires walking through the life experience of the tinkerer. This can make it easy for the tinkerer to hide bias.

Categories: Important, Understanding The World

Backlinks: Manifesto, The Enigma Of Reason, Why Greatness Cannot Be Planned, Aging Science, Sens, Perspective, Working With Complex Systems,