by Matt DeStefano
David Chalmers and I disagree about issues in philosophy of mind. Given that I believe Chalmers is an expert and I am not, shouldn’t I revise my belief to agree with him? After all, he probably knows better than I do.
A lot of attention in philosophy has been paid to peer disagreement: how one ought to revise her view upon learning that an epistemic peer disagrees with her (an epistemic peer being someone with the same evidence and the same reasoning abilities as she has). There is a further question about how we should respond when experts disagree with us.
There are some conditions under which disagreement might serve as a defeater for a position you hold. For instance, imagine that Katie and Andy are watching a horse race. Katie thinks that the horse named Ain’t Misbehavin won, while Andy thinks that Tango Goin Cash won. (I found these names by Googling “horse names”; it was very entertaining.) Katie knows that Andy’s vision is about as good as hers, that he was watching the race just as closely, and that he is her “peer” in the relevant sense. They are both equally likely to have made a mistake in declaring a winner. In this case, we might think that Andy’s belief that Tango Goin Cash won serves as a defeater for Katie’s belief that Ain’t Misbehavin won, and vice versa. At the very least, we might encourage Katie and Andy to suspend belief about which horse won (perhaps until the results are announced by an official).
What about expert opinion? Suppose Katie is watching the race with Susan, an expert analyst with a proven track record of correctly calling winners as they cross the finish line. Susan has excellent vision and has been watching horse races her entire life. If Susan were to say that Tango Goin Cash won, we might think that Katie ought to revise her view to agree with Susan, since Susan is an expert. At the least, it seems Katie should be more inclined to revise her belief here than in her disagreement with Andy.
There are other cases where it seems we are justified in believing something on the basis of expert opinion alone. For example, suppose I am feeling sick and go to four doctors to get opinions about my ailment. Three of them tell me I have the flu, while one says I have allergies. Most people would say that I am justified in believing that I have the flu, based on the majority opinion of the medical professionals.
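To see why a 3-to-1 verdict carries so much weight, here is a quick back-of-the-envelope Bayesian sketch. To be clear, nothing here comes from the case itself: the 0.5 prior, the 0.8 reliability figure, and the assumption that the doctors judge independently are all invented for illustration.

```python
# Toy Bayesian model: treat each doctor as an independent, equally
# reliable "flu detector". All numbers are made up for illustration.
prior_flu = 0.5      # prior probability that the illness is the flu
reliability = 0.8    # chance a given doctor reports the true condition

# Likelihood of three "flu" verdicts and one dissent, under each hypothesis
p_verdicts_if_flu = reliability**3 * (1 - reliability)      # 0.1024
p_verdicts_if_not = (1 - reliability)**3 * reliability      # 0.0064

# Bayes' theorem
posterior_flu = (p_verdicts_if_flu * prior_flu) / (
    p_verdicts_if_flu * prior_flu + p_verdicts_if_not * (1 - prior_flu)
)
print(f"P(flu | 3-to-1 verdicts) = {posterior_flu:.2f}")  # ~0.94
```

Even with a coin-flip prior and only modestly reliable doctors, the agreement of three independent experts against one pushes the probability above ninety percent, which matches the intuition that I am justified in believing the majority diagnosis.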
Consider philosophers who are not epistemic peers, such as David Chalmers and myself. He has been studying issues in philosophy of mind far longer than I have, he is more knowledgeable about the relevant arguments, and he is more intelligent than I am. When I find out that Chalmers disagrees with me about some important issue in the philosophy of mind, should I revise my belief in deference to his expertise? Intuitively, that seems like the rational course of action. However, there are plenty of experts who disagree with Chalmers; indeed, some of Chalmers’ positions are minority positions among the relevant experts. If deference to expertise were the whole story, the proper way to decide what we are justified in believing would be to count the philosophers who hold each position and adopt whichever view commands a majority. That is a truly ridiculous way to determine our beliefs, and we need a better way to understand how expert opinion should bear on what we believe.
Philosophers have taken a number of different positions on how we ought to treat peer disagreement. Some have argued that one ought to “split the difference” between the two views, either suspending judgment or moving to a middle position. Others have argued that one should “stick to one’s guns” and keep believing as before, despite the disagreement. Neither view easily captures how one ought to respond to expert disagreement.
One view, the Total Evidence View, offers a framework for handling disagreement as a form of evidence. Thomas Kelly has argued that peer disagreement counts as higher-order evidence, and that we should revise our beliefs in light of our total evidence. As Kelly puts it, “what it is reasonable to believe depends on both the original, first-order evidence as well as on the higher-order evidence that is afforded by the fact that one’s peers believe as they do” (p. 32 of the linked paper).
We can extend this Total Evidence View to our own disagreements with experts. It helps explain why I might be permitted (and even justified) in believing my diagnosis on the basis of a majority medical opinion, but not permitted (and certainly not justified) in believing whatever the majority of philosophers believe. In the case of my illness, my first-order evidence is my symptoms, together with a very limited idea of what those symptoms might entail. My higher-order evidence is that these doctors are much, much better than I am at diagnosing illnesses (this might also count as first-order evidence), and that a majority of them agree that I have the flu.
On the other hand, I have much more evidence for my positions in philosophy of mind. While I lack Chalmers’ breadth of evidence and his ability to discern which position that evidence supports, I have enough to form a reasonably justified opinion about these matters. In this case, expert opinion should not weigh nearly as heavily in determining my beliefs as it did in the diagnostic case. If we treat disagreement as higher-order evidence, we can decide much more sensibly how to handle disagreement with both peers and experts. It also eases my personal discomfort about disagreeing with Chalmers.
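One crude way to picture this weighing is a logarithmic opinion pool: average my credence with the expert’s in log-odds space, weighting my own view by how much first-order competence I have in the domain. To be clear, this is not Kelly’s own formalism, just a standard belief-aggregation device, and every number below (credences and weights alike) is invented for illustration.

```python
import math

def logit(p: float) -> float:
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def sigmoid(x: float) -> float:
    """Convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-x))

def pooled_credence(my_credence: float, expert_credence: float,
                    my_weight: float) -> float:
    """Weighted log-odds average of my credence and an expert's.
    my_weight near 0 means near-total deference; near 1 means the
    expert's view barely moves me."""
    x = my_weight * logit(my_credence) + (1 - my_weight) * logit(expert_credence)
    return sigmoid(x)

# Medical case: thin first-order evidence, so I weight myself lightly
# and end up essentially deferring to the doctors.
print(pooled_credence(my_credence=0.5, expert_credence=0.94, my_weight=0.1))  # ~0.92

# Philosophy-of-mind case: substantial first-order evidence of my own,
# so Chalmers' dissent moves me only partway (0.8 down to ~0.62).
print(pooled_credence(my_credence=0.8, expert_credence=0.3, my_weight=0.6))   # ~0.62
```

The only point of the toy model is the asymmetry: with almost no first-order competence I end up deferring almost entirely, while with substantial first-order evidence the expert’s dissent moves me, but only partway.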
Matt DeStefano
Sacramento State Philosophy Alum
&
Philosophy Graduate Student
University of Missouri-St. Louis