Epistemology is that part of philosophy that asks "What can we know?" "What can we be sure of?" "How do we get beyond mere opinion to real knowledge?"
Traditionally, there are two approaches to epistemology: rationalism, which says we gain knowledge through reasoning, and empiricism, which says we gain knowledge through sensory experience. Although a few philosophers hold one position to the exclusion of the other, most agree that both these approaches to knowledge are needed, and that to some extent they support and correct each other. More on that in a moment.
Rationalists focus on what they call necessary truth. By that they mean that certain things are necessarily true, always, universally. Another term that means the same thing is a priori truth. A priori is Latin for "from what comes before," so an a priori truth is something you know must be true before you even start looking at the world the senses reveal to us.
The most basic form of necessary truth is the self-evident truth. Self-evident means you don’t really even have to think about it. It has to be true. The truths of mathematics, for example, are often thought of as self-evident. One plus one equals two. You don’t need to go all over the world counting things to prove this. In fact, one plus one equals two is something you need to believe before you can count at all!
(One of the criticisms that empiricists would put forth is that “one plus one is two” is trivial. It is tautological, meaning it is true, sure, but not because it is self-evident: It is true because we made it that way. One plus one is the definition of two, and so with the rest of mathematics. We created math in such a way that it works consistently for us!)
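If you want to see what "true by definition" looks like, here is a small sketch (just an illustration, in Python, using a toy Peano-style encoding of the numbers): we define the numbers and addition ourselves, and one plus one equals two then falls out of those definitions, with no counting of things in the world required.

```python
# A toy, Peano-style encoding: numbers built from zero and a successor
# function, and addition defined over them. "1 + 1 = 2" then holds because
# of how we set things up, not because we went out and counted.
ZERO = ()                       # zero as the empty tuple

def succ(n):
    """The successor of n: wrap it in one more layer."""
    return (n,)

def add(a, b):
    """add(a, 0) = a;  add(a, succ(b)) = succ(add(a, b))."""
    return a if b == ZERO else succ(add(a, b[0]))

ONE = succ(ZERO)
TWO = succ(ONE)                 # "two" is defined as the successor of one

print(add(ONE, ONE) == TWO)     # True -- by definition, not by observation
```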
Other self-evident truths that have been put forth over the years include “you can’t be in two places at once,” “something either is or it isn’t,” “everything exists.” These are pretty good candidates, don’t you think? But often, what is self-evident to one person is not self-evident to another. “God exists” is perhaps the most obvious example -- some people disagree with it quite vigorously. Or “the universe had to have a beginning” -- some people believe it has always existed. A familiar use of the phrase “self-evident” is Thomas Jefferson's use of it in the Declaration of Independence: “We hold these truths to be self-evident: That all men are created equal....” But it is pretty obvious to most that this is not, really, true. Instead, it is a rhetorical device, that is, it sounds good to put it that way!
In order to reason our way to more complex knowledge, we have to add deduction (also known as analytic truth) to the picture. This is what we usually think of when we think of thinking: With the rules of logic, we can discover what truths follow from other truths. The basic form of this is the syllogism, a pattern invented by Aristotle which has continued to be the foundation of logic to the present day.
The traditional example is this one, called modus ponens: “All men are mortal. Socrates is a man. Therefore Socrates is mortal.” If x, then y (if you are human, then you are mortal). X (you are human). Therefore, y (you are mortal). This result will always be true, if the first two parts are true. So we can create whole systems of knowledge by using more and more of these logical deductions!
Another syllogism that always works is in the form “If x, then y. Not y. Therefore not x.” If you are human, then you are mortal. You are not mortal. Therefore, you are not human. If the first two parts are true, then the last one is necessarily true. This one is called modus tollens.
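These two patterns can actually be checked mechanically. The following is a small sketch (an illustration in Python, not anything from Aristotle): it tries every combination of true and false and confirms that neither modus ponens nor modus tollens ever takes you from true premises to a false conclusion.

```python
# A brute-force check: a pattern of reasoning is valid when no assignment of
# truth values makes all its premises true and its conclusion false.
from itertools import product

def implies(p, q):
    return (not p) or q         # "if p then q" fails only when p is true and q is false

def is_valid(pattern):
    """pattern(x, y) returns (premises, conclusion) for one row of the truth table."""
    for x, y in product([True, False], repeat=2):
        premises, conclusion = pattern(x, y)
        if all(premises) and not conclusion:
            return False        # found a counterexample
    return True

# Modus ponens: if x then y; x; therefore y.
modus_ponens = lambda x, y: ([implies(x, y), x], y)
# Modus tollens: if x then y; not y; therefore not x.
modus_tollens = lambda x, y: ([implies(x, y), not y], not x)

print(is_valid(modus_ponens))   # True
print(is_valid(modus_tollens))  # True
```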
On the other hand, there are two examples that don’t work, even though they sound an awful lot like the ones I just showed you: If x, then y. Not x. Therefore not y. If you are human, then you are mortal. You are not human. Therefore you are not mortal. That, of course, would come as a big surprise to animals! Or look at this example: “If God would show himself to me personally, that would prove the truth of religion. But he hasn’t done so. Therefore, religion is false.” It sounds like a reasonable argument, but it is not. (This is called denial of the antecedent.)
Another goes like this: If x, then y. Y. Therefore x. If you are human, then you are mortal. You are mortal. Therefore you are human. Or try this one: “If God created the universe, we would see order in nature. We do in fact see order in the universe -- the laws of nature! Therefore, God must have created the universe.” It sounds good, doesn’t it? But it is not at all logical: The order in the universe could have another cause. (This is called affirmation of the consequent.)
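The same brute-force check exposes the two impostors: for each of them there is a row of the truth table -- mortal but not human, for example -- where the premises hold and the conclusion fails. Again, just a sketch for illustration.

```python
# The same idea applied to the look-alikes: each has a counterexample row
# (x false, y true -- not human, yet still mortal), so neither is valid.
from itertools import product

def implies(p, q):
    return (not p) or q

rows = list(product([True, False], repeat=2))

# Denial of the antecedent: if x then y; not x; therefore not y.
print(all(not y for x, y in rows if implies(x, y) and not x))   # False -- invalid

# Affirmation of the consequent: if x then y; y; therefore x.
print(all(x for x, y in rows if implies(x, y) and y))           # False -- invalid
```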
There are many types of rationalism, and we usually refer to them by their creators. The best known, of course, is Plato’s (and Socrates’). Aristotle, although he pretty much invented formal logic, was not entirely a rationalist -- he was also interested in the truths of the senses. The most magnificent example of rationalism is Benedict Spinoza’s. In a book called Ethics, he began with one self-evident truth: God exists. By God, he meant the entire universe, both physical and spiritual, so his truth does seem pretty self-evident: Everything that is, is! But from that truth, he carefully, step by step, reasons his way to a very sophisticated system of metaphysics, ethics, and psychology.
Now let’s turn to empiricism. Empiricism focuses, logically enough, on empirical truth (also known as synthetic truth), which we derive from our sensory experience of the world.
Many people think that empiricism is the same thing as science. That is an unfortunate mistake. The reason that empiricism is so closely tied in our minds to science is really more historical than philosophical: After many centuries of religious rationalism dominating European thinking, people like Galileo and Francis Bacon came out and said, hey, how about paying some attention to the world out there, instead of just trying to derive truth from the scriptures? The stage for this change in attitude was, in fact, already set by St. Thomas Aquinas, who at least felt that scriptural truth and empirical truth need not conflict!
The simplest form of empirical truth is that based on direct observation -- taking a good hard look. Now this is not the same as anecdotal evidence, such as “I know a guy who has a cousin in Topeka who married a woman whose college roommate saw a UFO.” It’s not really even the same as “I saw a UFO.” It means that there is an observation that I made that you can make, too, and that, were it possible, everyone should be able to make. In other words, here’s a UFO: Take a look!
(Rationalists would argue, of course, that we could very well ALL be having an hallucination!)
In order to build a more complex body of knowledge from these direct observations, we must make use of induction, also known as indirect empirical knowledge. We take the observations and carefully stretch them to cover more ground than we could actually cover directly. The basic form of this is called generalization. Say you see that a certain metal melts at a certain temperature. In fact, you’ve seen it many times, and you’ve shown it to others. At some point, you make the inductive leap and say “the melting point of this metal is so many degrees.” Now it’s true that you haven’t melted every bit of this metal in the universe, but you feel reasonably confident that (under the same conditions) it will melt at so many degrees. That’s generalization.
You can see that this is where statistics comes in, especially in a more wishy-washy science like psychology. How many observations do you need to make before you can comfortably generalize? How many exceptions to the desired result can you explain away as some sort of methodological error before it gets to be too much? What are the odds that my observation is actually true beyond these few instances of it?
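To make the statistical side of the inductive leap concrete, here is a rough sketch with invented numbers: a handful of melting-point observations, their mean, and a crude confidence interval that says how far we are entitled to trust the generalization.

```python
# Invented measurements of a melting point, in degrees Celsius. The mean is
# our generalization; the standard error tells us how much to trust it.
from statistics import mean, stdev
from math import sqrt

observations = [231.8, 232.1, 231.9, 232.0, 232.2, 231.7, 232.0, 231.9]

m = mean(observations)
se = stdev(observations) / sqrt(len(observations))   # standard error of the mean

# Roughly a 95% confidence interval (normal approximation, about 2 standard errors).
print(f"melting point: about {m:.2f} C, plus or minus {2 * se:.2f} C")
```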
Just as there are different styles of rationalism, there are different types of empiricism. In this case, we have given them names of their own. Most empirical approaches are forms of epistemological realism, which says that what the senses show us is reality, is the truth.
The basic form of realism is direct realism (also known as simple or “naive” realism -- the latter obviously used by those who disagree with it!). Direct realism says that what you see is what you get: The senses portray the world accurately. The Scottish philosopher Thomas Reid is the best known direct realist.
The other kind is called critical (or representative) realism, which suggests that we see sensations, the images of the things in the real world, not the things directly. Critical realists, like rationalists, point out how often our eyes (and other senses) deceive us. One famous example is the way a stick jutting out of the water seems to be bent at the point where it breaks the surface. Take it out of the water, and you find it is straight. Something about the way light travels differently through air and water leads to this illusion. So what we really see are sensations, which are representations of what is real. Descartes and Locke were both critical realists. So are the majority of psychologists who study sensation, perception, and cognition.
But, to give Reid his due, a direct realist would respond to the critical realist that what we call illusions are actually matters of insufficient information. We don’t perceive the world in flash photos: We move around, move our eyes and ears, use all our senses.... To go back to the stick, a complete empirical experience of it would include seeing it from all directions, perhaps even removing it. Then we will see not only the real stick, just as it is, but the laws of air-water optics as well! A modern direct realist is the psychologist J. J. Gibson.
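Incidentally, the "laws of air-water optics" really can be written down. Here is a quick sketch using Snell's law (which relates the angles of a light ray on either side of the surface), showing how much a ray bends as it leaves the water -- which is why the submerged part of the stick appears shifted.

```python
# Snell's law: n_water * sin(angle_in_water) = n_air * sin(angle_in_air).
from math import asin, sin, radians, degrees

n_water, n_air = 1.33, 1.00     # approximate refractive indices
angle_in_water = radians(30)    # a ray leaving the stick, 30 degrees from the vertical

angle_in_air = asin(n_water * sin(angle_in_water) / n_air)

print(f"30.0 degrees under water becomes {degrees(angle_in_air):.1f} degrees in air")
```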
There is a third, rather unusual form of empiricism called subjective idealism that is most associated with Bishop George Berkeley. As an idealist in terms of his metaphysics, he argued that what we see is actually already a psychological or mental thing to begin with. In fact, if we don’t see it, it isn’t really there: “To be is to be perceived” is how he put it. Now, this doesn’t mean that the table you are sitting at simply ceases to be when you leave the room: God’s mind is always present to maintain the table’s existence!
There is this famous question: "If a tree falls in the woods, and there is no one there to hear it, does it make a sound?" The subjective idealist answer is yes, it does, because God is always there.
Another way to look at these three empirical approaches is like this: Critical realism postulates two steps to experiencing the world. First there is the thing itself and the light or sounds (etc.) it gives off. Second, there is the mental processing that goes on sometime after that light hits our retinas, or the sound hits our eardrums. Direct realism says that the first step is enough. Subjective idealism says that the second step is all there is.
(An old story tells about three baseball umpires bragging about their abilities. The first one says "I call 'em as I see 'em!" The second one says "Well, I call 'em as they are!" And the third one says "Shoot, they ain't anything till I call 'em!" The first is a critical realist, the second a direct realist, and the third is a subjective idealist.)
As I said at the beginning of this section, rationalism and empiricism don’t really have to remain antagonistic, and in fact they haven’t. It could even be said that science is a very well balanced blend of the two, where each serves, like the branches of government, as a check and balance to the other.
The traditional, ideal picture of science looks like this:
Let’s start with a theory about how the world works. From this theory we deduce, using our best logic, a hypothesis, a guess, regarding what we will find in the world of our senses, moving from the general to the specific. This is rationalism. Then, when we observe what happens in the world of our senses, we take that information and inductively support or alter our theory, moving from the specific to the general. This is empiricism. And then we start again around the circle. So science combines empiricism and rationalism into a cycle of progressive knowledge.
Now notice some of the problems science runs into: If my theory is true, then my hypothesis will be supported by observation and/or experiment. But notice: If my hypothesis is supported, that does not mean that my theory is true. It just means that my theory is not necessarily wrong! On the other hand, if my hypothesis is not supported, that does in fact mean that my theory is wrong (assuming everything else is right and proper). So, in science, we never have a theory we can say is unequivocally true. We only have theories that have stood the test of time. They haven’t been shown to be false... yet!
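Here is a toy sketch of that cycle and its lopsidedness, with an invented example: the theory deduces a specific prediction, an observation either matches it or not, and a match merely keeps the theory alive while a clear mismatch counts against it.

```python
# An invented example of the cycle: deduce a prediction from the "theory",
# compare it with an observation, and note the asymmetry in what each
# outcome licenses us to say.

def predict_boiling_point(theory_constant):
    """Deduction: from the theory we derive a specific, testable hypothesis."""
    return theory_constant      # hypothesis: water boils at this temperature (C)

def observe_boiling_point():
    """Observation: what the world of the senses actually reports (made up here)."""
    return 100.2

theory_constant = 100.0
hypothesis = predict_boiling_point(theory_constant)
observation = observe_boiling_point()

if abs(observation - hypothesis) < 1.0:
    # Supported: the theory survives this test -- but is not thereby proven true.
    print("hypothesis supported -- theory not refuted (still not proven)")
else:
    # Not supported: the theory (or the experiment) has to be revised,
    # and we go around the circle again.
    print("hypothesis failed -- alter the theory and start the cycle again")
```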
This is one of the things that most people don’t seem to understand about science. For example, people who prefer creationism over evolution will say that, since evolution is “only a theory,” creationism is just as legitimate. But evolution has been tested again and again and again, and it has held up tremendously well against the observations scientists have made since Darwin. It's like saying that a thoroughbred race horse is "just a horse," and therefore any old nag is just as good!
On the other hand, creationism fails quickly and easily. Radiometric dating shows that the earth is far older than creationists suggest. There are fossils of species that no longer exist. There is a notable lack of fossils of human beings during the dinosaur age. There are intermediate fossils that show connections between species. There are examples of species changing right before our eyes. There is a vast body of related knowledge concerning genetics. But with every piece of evidence shown to the creationists, they respond with what the logicians call an ad hoc argument.
An ad hoc argument is one that is created after the fact, in an attempt to deal with an unforeseen problem, instead of being a part of the theory from the beginning. So, if there is a rock that is too old, or a fossil that shouldn’t be, the creationist might respond with “well, God put that there in order to test our faith,” or “the days in Genesis were actually millions of years long” or “mysterious are the ways of the Lord.” Obviously, creationism is based on faith, not science.
Science is always a work in progress. No one believes in evolution, or the theory of relativity, or the laws of thermodynamics, the same way that someone believes in God, angels, or the Bible. Rather, we accept evolution (etc.) as the best explanation available for now, the one that has the best reasoning working for it, the one that fits best with the evidence we have. Science is not a matter of faith.
Science is, of course, embedded in society and influenced by culture and, like any human endeavor, it can be warped by greed and pride and simple incompetence. Scientists may be corrupt, scientific organizations may be dominated by some special interest group or another, experimental results may be falsified, studies may be poorly constructed, scientific results may be used to support bad policy decisions, and on and on. But science is really just this method of gaining knowledge -- not knowledge we can necessarily be certain about, but knowledge that we can rely upon and use with some confidence. For all the negatives, it has been the most successful method we have tried.
© Copyright 1999, C. George Boeree. All rights reserved.