by Kippy Myers
I suppose that if someone believes a true thing, they will benefit from that belief because it is true, and by its very nature truth carries its own blessings and hardships. But why do they believe it? Precisely what facts, emotions, information, ignorance, passion, gullibility, or rationality actually led them to acquiesce and embrace it? If a person has weird, bad, or just plain dumb reasons for accepting something as the truth, are they just as well off as someone who has come to accept the same truth only after carefully investigating, gathering facts and arguments on both sides of the issue, and subjecting the conclusion to the tools of logic? Of course, if the two people are subjected to rigorous questioning, the more studied believer will come out much better: having reasoned their way to the belief, they know why they accept it and can give studied answers when challenged. The person who accepts the same belief for less studied, unstudied, or perhaps even emotional or psychological reasons will eventually hit a wall as he or she weakens under the weight of opposition. This point bears on why we believe many things about life, social structures, politics, and religion. For example, why do you believe that a certain medication is not slowly killing you? Why do you believe that your sports team is the greatest? Why do you believe that God exists?
Which is Better? Knowledge or Luck?
So is it better to be correct about something because you know it via the relevant facts and arguments, or is it better to be correct about something because you merely want it to be true (or for similar reasons)? A famous Gettier-style thought experiment (this version is actually due to Roderick Chisholm) imagines that Mr. Jones looks out at a field and says that there is a sheep there. In reality, what has convinced Jones that a sheep is in the field is a rock out there that looks like a sheep. Thus, his evidence has misled him. But he is correct on the facts, because unbeknownst to Jones there is in fact a sheep in another section of the pasture of which he is unaware. So when Jones says, “There is a sheep in the pasture,” he is factually correct, but his belief is not based on his having accurate information. The truth of his statement happens only by a stroke of luck. His belief turns out to be accurate, even though his reasons for believing it are misdirected.
So what is the difference between believing something that turns out to be true in fact, though you did not believe it because of the facts, and believing that there was a sheep in the pasture because you had indeed seen a sheep there? Is one any better or worse than the other? Socrates says that if you turn out to be correct by chance, then that is not “due to human direction”; your belief is not based in reason or knowledge (Meno 99a) and is thus qualitatively inferior. It would be like comparing a woman who wins a game by skill to a woman who wins a game by luck. Either can win, and either winner gets the prize. But which one will be more respected? Which one did more? Which one deserves the win more: the one who did the work, or the one who won by sheer luck? In either case, the result is the same, viz., a number in the win column. So we must ask which is greater: knowledge or a lucky guess? Maybe it makes little or no difference in some game scenarios. But what about cases that deal with matters of ultimate concern, like the existence of God, the proper need and means of salvation, or the identity of the church that is directly connected to the Bible?
True Beliefs, False Beliefs, and Knowledge
Simply believing something to be true has in itself no “truth-making power.” Just because you think X is true does not make it so. In the Meno, Socrates distinguishes between true opinions and false opinions. A false belief is something that you accept as true but turns out to be false. So let’s say that a true belief is something that you accept as true and turns out to be true. Neither opinions nor beliefs are the same thing as knowledge (as Socrates sees it). Therefore, he reasons, mere true opinions/beliefs are no more knowledge than false opinions are. But in many cases they can be just as useful. Thus, a true belief can be the precursor of knowledge: once a true belief is tested, or becomes grounded in some legitimate proof, it is knowledge. Or as Socrates puts it, true opinions “can be aroused by questioning and turned into knowledge” (86a). Still, he values proven beliefs (knowledge) over simple beliefs, saying that knowledge is “more valuable than right opinion” (98a).
Maybe you are in the car with someone who asks you where Mrs. Jones lives. You give the location that you believe, or opine, is accurate: 721 Pine Drive. But when you arrive at 721 Pine Drive, you realize that this is not the home of Mrs. Jones. You thought it was, but you were wrong. You had a false belief. You felt certain that it was true, but when the facts emerged, you discovered your error. Facts are stubborn things, and when brought to bear on beliefs they will on occasion prove those beliefs either true or false. Now suppose instead that someone asks you for directions to Mr. Franklin's house, and you give the directions that you think are correct and that, as it turns out, are indeed correct. But you don’t know they are correct until you follow them and see that they got you to the proper location. Prior to your arrival at Mr. Franklin's house, you only opined that your directions were accurate. Thus, as Socrates points out, even if you don’t know the proper directions to Mr. Franklin's house but think you do, your directions will get the person to the correct location anyway, because your belief turns out to be a true belief. Only once you arrived at the house and had a certain kind of proof could you know that your belief was true. That was the moment when true belief became knowledge. Which is to say that a true belief will regularly cash out with the same results as knowledge. But you did not (in the language of Socrates) know the location, because you had no empirical or objective proof that undergirded or substantiated your belief. You merely had an opinion about it that turned out to be factually true. While you might have had some type of reason (however slight) for believing it, you were correct through no particular virtue of your own, since the belief was not based in fact or in personal knowledge. In such cases, true opinions function just as well as knowledge.
The two may be equally useful, but unequally “valuable.”
Now then, if Mr. Benjamin believes that God exists but is unable to provide you with any reasons for his belief, I think that Socrates would say that Benjamin has a true opinion, but not knowledge. That is, he doesn’t know that God exists, but he has an opinion that He does. And yet, would not his true opinion please God as well as the knowledge of a Christian philosopher who has spent decades studying the pros and cons of a string of theistic arguments that demonstrate God’s existence? Which of the two would be more invincible to attacks from an unbeliever: Mr. Benjamin or the philosopher? Additionally, why are some of us satisfied with our own lack of evidence for believing that God exists, that the Bible is God’s word, and that Jesus Christ is God’s Son? I think that atheist Christopher Hitchens had a legitimate point when he said in an interview on C-SPAN, “If I were to come to you and say, ‘Hey, I’m willing to believe a lot of extremely important claims without any evidence at all,’ now, will you respect me?” If we have no good reasons for our own beliefs, how can we expect to convince other folks to accept our belief system?
Of course, we might feel absolutely certain about our beliefs even if their grounding is unstable or possibly unsubstantiated. But certainty is a personal feeling of assurance and is not necessarily connected to truth outside one’s mind. Maybe we could say that a person’s feeling of certainty is not proportionate to the truth of that person’s belief. Have you ever been absolutely certain that your view or belief was correct and yet it turned out that you were absolutely wrong? I have. Thus, that internal feeling is not a measure of knowledge or of truth. Yet, we can only act on those matters that we are certain about. We just need to be certain that our certainty is based on solid ground rather than pride, wishful thinking, or passions. I agree with philosopher Karl Popper’s statement, “We must distinguish between truth, which is objective and absolute, and certainty, which is subjective.”
It seems to me that it is each person’s responsibility as a free moral agent to garner sufficient reasons for his or her most important beliefs (existence of God, meaning of life, church, salvation, marriage). What is sufficient for one person might not be for another. What is sufficient for one person might be merely subjectively sufficient while objectively insufficient (e.g., a hate monger might latch onto any information that to him “proves” that races other than his are inferior). But for the less important beliefs (favorite sports team, brand of toothpaste, choice of computer) this is not quite as significant. Thus, we need to talk not only about what we believe, but why we believe it and the interplay of those two. But we can’t have that conversation right here right now.
Phillip E. Johnson has had an illustrious career in the study and practice of law. He is a graduate of Harvard Law School and served as a law professor at the University of California, Berkeley for thirty years. In his little book Objections Sustained, Johnson includes a chapter entitled “Harter’s Precept” that I have found to be enlightening and challenging for my life, and particularly applicable in the disciplines of education, science, and religion.
Johnson begins the chapter with an intriguing statement: “I am convinced that conscious dishonesty is much less important in intellectual matters than self-deception.” Let that rattle around in your head for a while. Which is worse: knowing that you’re being dishonest, or not knowing it? Johnson thinks that, in a given situation, being fully aware that you are being dishonest is somehow less significant than convincing yourself that you are right and then proceeding in that ignorance, perhaps teaching falsehoods or practicing wickedness, but with a good conscience and thus a good motive. In Galatians 1:13, the apostle Paul says that in his pre-Christian days he had persecuted the church and tried to destroy it. But he was a well-meaning persecutor: he thought that he had the truth at the time, and hence carried out his opposition to the church with a clear conscience (cf. Acts 24:18). At the time, it seems, he was certain that he was doing the right thing in persecuting the church. Later, of course, after he realized that he had been wrong, he made a dramatic 180-degree turn and became a valiant and often persecuted defender of the faith. So his former false belief was in reality hurting him and others, even though he lived out that belief to the best of his ability and with a clean conscience. But it is a different scenario when you know that you are wrong, and are aware that you are lying to yourself and others, but enable yourself through self-deceit (a term that needs to be analyzed further) to continue doing so.
A good Logic class will help us to see through faulty argumentation and misdirection. But a great deal of genuine self-examination is required in order to be a truth seeker (cf. 2 Thessalonians 2:10), because our motives are sometimes askew (owing to factors such as pride, lust, power, wealth, fame) and lead us in wrong directions, away from the good, the right, the true, and the beautiful in life. Johnson quotes physicist Richard Feynman, who said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.” It’s easier for you to fool you. Think about it.
If you know that you are wrong, you are free to commit the wrong act anyway, in spite of your full awareness that it’s wrong. However, there might be situations where you would choose not to commit that wrong act. Your conscience could trouble you and hinder your wrong actions. But if you allow yourself or perhaps enable yourself to believe that you are right, your conscience can reach the point where you are actually emboldened by it to move forward, even if you are factually and morally wrong (cf. Proverbs 30:20).
Johnson tells the story of biologist Benno Müller-Hill, who believed that “self-deception plays an astonishing role in science in spite of all the scientists’ worship of truth.” A physics teacher wanted his class (which included Müller-Hill) to see a planet through a telescope. The first boy came to the telescope and looked into it. The teacher asked if he could see anything. The boy was nearsighted and had trouble seeing anything through the device. The teacher quickly adjusted the focus, and this enabled the student to see the planet. “Yes, I see it!” Then the rest of the students took turns, and no one had a problem seeing the planet. In time, the student just in front of Müller-Hill in line (a lad named Harter) came to the telescope and looked inside. He couldn’t see a thing. It was all black, he told his teacher. The teacher called him an idiot and demanded that he adjust the focus. Still Harter claimed that he could see nothing through the telescope. Finally, the teacher looked for himself, only to discover that the student was correct. There was a cover over the lens. None of the students had seen a thing, but only one of them was willing to say so.
According to Johnson, “Harter’s Precept says that the way to advance in academic life is to learn to see what you are supposed to see, whether it is there or not.” Yes, indoctrination happens at all levels. Even in graduate school. Even in science (see Thomas Kuhn’s classic The Structure of Scientific Revolutions). Even in religion. Even in atheism. We are indoctrinated in many ways, not all of them bad.
When we consider what we opine versus what we know, how we know what we know, why we think we know some things that we don’t, the role of the individual and his or her passions and prejudices in it all, and the many ways that we are indoctrinated, it should lead us to humility, to respect for those who can consider new and different ideas, to appreciation for the testing of our beliefs, and to further study of our own personal ideals.
I’ve said too much, and there’s so much more to say.